
ISSUE 20.35.F • 2023-08-28
You’re reading the FREE newsletter


You’ll immediately gain access to the longer, better version of the newsletter when you make a donation and become a Plus Member. You’ll receive all the articles shown in the table of contents below, plus access to all our premium content for the next 12 months. And you’ll have access to our complete newsletter archive!

Upgrade to Plus membership today and enjoy all the Plus benefits!

In this issue

LEGAL BRIEF: A t-AI-l of two cities

Additional articles in the PLUS issue

WINDOWS 11: SMB security changes in Windows 11 might affect your NAS, too

FREEWARE SPOTLIGHT: DeviceRescue — So long, Device Manager

SOFTWARE: Ready to retire your servers?



LEGAL BRIEF

A t-AI-l of two cities


By Max Stul Oppenheimer, Esq.

Generative artificial intelligence is the latest of the “we’ve never seen anything like this before — something must be done” technologies.

Earlier examples are biotechnology, the personal computer, the video recorder, the record player, and probably fire — none of which caused the world to end.

Multiple players are deciding their opening moves in response to this technology’s sudden entry into the public consciousness. Not surprisingly, their approaches differ because their interests differ. That is certainly true of Microsoft and its Services Agreement.

It was the most complicated of times …

In the city of Redmond, Washington, Microsoft needed to decide how to handle public generative AI. On the one hand, this technology offers great opportunities for expanding the use of (and therefore revenues from) its existing products; on the other hand, it poses the risk that the tool might be used to create competing products (possibly even using Microsoft’s own technology). And liability resulting from its use is unclear.

The public face of Microsoft’s opening move is the modification of its agreement(s) with its users — the Microsoft Services Agreement (MSA), including its multiple subparts.

Caveats

Before reviewing the changes to the MSA, a few warnings are in order.

1. The link to the MSA above presents the English-language version. Clicking multiple times on “English” will not make it any clearer.

2. Roughly five percent of the world’s oceans have been explored, which is to say very little. That’s probably true of the MSA as well. It is a complicated document, incorporating other documents by reference. A simple request to Microsoft (“How many pages is the complete agreement?”) went unanswered. We asked ChatGPT to compare the current version of the MSA with the new version. It responded: “The message you submitted was too long, please reload the conversation and submit something shorter.”

3. Therefore, this article looks only at what Microsoft says it is changing. Looking at Change #1 (“In the header, we updated the publication date to July 30, 2023, and the effective date to September 30, 2023”) might lead you to think, “Well, if they’re listing changes that insignificant, surely all the important changes are listed in the document.” But be warned: the summary page states, “This page provides a summary of the most notable changes to the Microsoft Services Agreement. To see all of the changes, please read the full Microsoft Services Agreement here.” (The bold type is in the original. That usually means it’s important.) If you follow the link, you are told that the average reading time is 55 minutes. Go ahead, I’ll wait.

Back so soon? Let’s just assume that neither of us has read the full agreement and that not all changes are discussed in the summary, or here.

With those caveats, what does Microsoft say it has changed?

The list

Here’s the list, somewhat abbreviated.

  1. In the header, we updated the publication date to July 30, 2023, and the effective date to September 30, 2023.
  2. In the Your Privacy section, we expanded the definition of “Your Content” to include content that is generated by your use of our AI services.
  3. In the Code of Conduct section, we added verbiage to govern the use of AI services.
  4. In the Using the Services & Support section, we added a Moderation and Enforcement section to clarify and help users better understand these practices.
  5. In the Service-Specific Terms section, we made [a bunch of product-specific changes and … ]
    e. We added a section on AI services to set out certain restrictions, use of Your Content and requirements associated with the use of the AI services.
  6. In the Notices section, we made edits to update the notice status of certain licenses and patents.
  7. Throughout the Terms, we made changes to improve clarity and to address grammar, typos, and other similar issues. We also updated naming and hyperlinks.

Sounds pretty innocuous. We’ve already discussed Item 1 — no biggie. Item 2 sounds like Microsoft is giving its users something. Item 3 just adds “verbiage,” a French term that implies “please don’t look here.” Items 4 and 7 are just clarifications to help users understand. Item 6 looks unimportant. This leaves only Item 5(e) as something that sounds like it might be taking something away from users.

Just for fun — and not out of lack of trust — let’s try redlining the new policy.

Whoa! Look at all that red! In fairness, most of it is just updating references, as advertised. But one item not mentioned in the summary is the change to the Privacy Policy, and several of its listed changes deserve explanation.
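
If you want to try the same exercise yourself, a crude redline is easy to generate. The sketch below is my own illustration, not something Microsoft provides; it assumes you have pasted the current and the upcoming terms into two text files (the file names are hypothetical) and uses Python’s standard difflib module to write a side-by-side HTML comparison you can open in a browser.

    # A minimal sketch, assuming the old and new terms are saved as plain text.
    # msa_old.txt and msa_new.txt are hypothetical file names.
    import difflib

    with open("msa_old.txt", encoding="utf-8") as f:
        old_lines = f.readlines()
    with open("msa_new.txt", encoding="utf-8") as f:
        new_lines = f.readlines()

    # Side-by-side HTML diff; context=True keeps only the changed regions.
    html = difflib.HtmlDiff(wrapcolumn=80).make_file(
        old_lines, new_lines, fromdesc="Current MSA", todesc="New MSA", context=True
    )

    with open("msa_redline.html", "w", encoding="utf-8") as f:
        f.write(html)
    print("Wrote msa_redline.html")

It won’t match a lawyer’s redline, but it makes the scale of the changes obvious at a glance.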

Defining content: A bold new definition of “yours”

As before, Microsoft has staked out certain rights regarding “your” content, defined as:

… your communications with others; postings submitted by you to Microsoft via the Services; and the files, photos, documents, audio, digital works, livestreams and videos that you upload, store, broadcast, create, generate, or share through the Services or inputs that you submit in order to generate content.

That last item — arguably redundant — appears to be targeted directly at prompts submitted to whatever generative AI Microsoft plans to make available. This becomes important when we get to the section outlining Microsoft’s rights to “your” content.

The seemingly good news is that “We don’t claim ownership of Your Content. Your Content remains yours and you are responsible for it.” In case that second sentence is unclear, the policy continues: “You represent and warrant that … you have … all the rights necessary for Your Content.” Although that may sound like a positive, it is the prelude to “… and that the collection, use, and retention of Your Content will not violate any law or rights of others. Microsoft cannot be held responsible for Your Content or the material others upload, store or share using the Services.” In other words, if your content violates anyone’s rights, it’s your problem, not Microsoft’s.

That might seem fair and reasonable if you controlled how your content was used. But the MSA continues:

To the extent necessary to provide the Services to you and others, to protect you and the Services, and to improve Microsoft products and services, you grant to Microsoft a worldwide and royalty-free intellectual property license to use Your Content … Your Content may appear in demonstrations or materials that promote the Service.

I added the bold for emphasis, and I wonder whether using “your content” to train its AI is permitted by the MSA. And if so, and if the resulting product creates liability, might that be attributed to your content, which “Microsoft cannot be held responsible for …”?

Microsoft takes additional rights in the Service Specific provisions, for example (regarding Xbox):

You grant to Microsoft, its affiliates and sublicensees a free and worldwide right to use, modify, reproduce, distribute, broadcast, share and display Your Content or your name, gamertag, motto, or avatar that you posted for any Xbox Services.

Note that you have no control over who sublicensees are or how they might use your content.

Microsoft also has a specific section relating to AI — including provisions prohibiting you from doing what they’re doing:

Extracting data — Unless explicitly permitted, you may not use Web scraping, Web harvesting, or Web data-extraction methods to extract data from the AI services.

Limits on use of data from the AI services — You may not use the AI services, or data from the AI services, to create, train, or improve (directly or indirectly) any other AI service.

Use of your content — As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for, and preventing, abusive or harmful uses or outputs of the service.

Third-party claims — You are solely responsible for responding to any third-party claims regarding your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during your use of the AI services).

A physical-world analogy might be: “I’m giving you a car! It’s your car!! You need to insure it, maintain it, pay for gas (or electricity), repair any damage, and pay any taxes because it’s your car.” Makes sense. “Oh — one more thing. I get to use it whenever and however I want. But, because it’s your car, if my use causes any harm to anyone, you’ll be responsible for that. You’ll also be responsible for any speeding or parking tickets I may collect. By the way, that applies not just to my use, but to anyone else I allow to use it — and I plan to let millions of people use it. And I plan to charge them for it. Enjoy your new car! Here’s one set of keys; I’ll keep the other.”

The revised Code of Conduct: Refreshing honesty, or does “verbiage” not mean what they think it means?

In the summary of changes, we’re advised that “In the Code of Conduct section, we added verbiage to govern the use of AI services.” Britannica defines “verbiage” as “speech or writing that contains too many words or that uses words that are more difficult than necessary.”

Let’s assume that Microsoft is being honest. (Surely it still has a copy of Microsoft Bookshelf [1993 edition] lying around. Its definition is pretty much the same as Britannica’s, and it adds this interesting bit of color: the word stems from the Old French “verbier,” meaning “to chatter.”)

The new policy prohibits a number of things, including activities using generative AI:

Don’t publicly display or use the Services to generate or share inappropriate content or material.

Don’t violate or infringe upon the rights of others (e.g., unauthorized sharing of copyrighted music or other copyrighted material, resale or other distribution of Bing maps, or taking photographs or video/audio recordings of others without their consent for processing of an individual’s biometric identifiers/information or any other purpose using any of the Services).

To be honest, I don’t know for sure what either of these provisions means. Nice verbiage!

Why?

Microsoft wants to build and sell tools and products based on generative artificial intelligence. To do so, it needs access to training materials. The largest source of those materials is content posted on the Web. Under US law, original material becomes copyrighted as soon as it is “fixed in a tangible medium,” which courts have concluded includes posting online. With certain exceptions, copyright owners are entitled to compensation for use of their copyrighted material. There are current court cases that will turn on whether material posted on the Web falls within one of those exceptions, but at the moment it’s risky to assume that it does.

But there’s another solution. Copyrights are personal property, and therefore copyright owners are free to sell or license access to and use of their copyrights. Microsoft wants you to do just that. The MSA is a contract in which Microsoft grants you the right to use its products and, in exchange, you grant it the right to use your content.

Microsoft’s interests lie in building a competitive generative AI product. It could buy — or pay to create — content. But its bottom line will be better if it can get that content for free. And its millions of users create massive amounts of content daily. Microsoft is using the attraction of its products (which presumably will get even better, thanks to your content) to get you to agree, contractually, to let it use your content.

It also wants to prevent others from using its services or content to build a competitive product.

And, finally, it does not want to be held liable if your content causes trouble — for example, by violating someone’s copyright or providing harmful false information.

Will it work?

The MSA won’t work in all cases. In order to enter a contract, the parties must be “competent,” a legal term that includes, for example, being of legal age. So a five-year-old could click “OK” but not be legally bound. That’s the reason Microsoft includes a warranty. (Although, if the user can’t enter the contract, one wonders whether the warranty would be enforceable — but there’s only so much a lawyer can do.)

Microsoft has powerful weapons at its disposal. If you violate its terms, it can take actions including “limiting access to certain features or Services, stopping providing Services, [and] closing your Microsoft account immediately.” In case you didn’t realize it, “Closure of your access to a Service or your account may result in forfeiture of content licenses, [and] associated content … .” Be sure you keep backups.

What about your rights?

It’s pretty clear that your rights are not high on Microsoft’s list of priorities. Its objectives are twofold: getting access to content and training materials for its new AI products while preventing competitors from doing the same, and trying to shield itself from liability arising from the growing number of lawsuits over AI.

You probably don’t have the bargaining power to negotiate a better deal.

Expect other companies to follow suit; expect the agreements to get increasingly one-sided and more complicated as new threats are perceived. If you’re going to get a better deal, someone with a lot more power will need to take up your cause.

That’s the view from Redmond. Next up: the view from Washington, DC.

Join the conversation! Your questions, comments, and feedback about this topic are always welcome in our forums!

Max Stul Oppenheimer is a tenured full professor at the University of Baltimore School of Law, where he teaches business and intellectual property law. He is a registered patent attorney licensed to practice law in Maryland and DC. Any opinions expressed in this article are his and are not intended as legal advice.




Here are the other stories in this week’s Plus Newsletter

WINDOWS 11


SMB security changes in Windows 11 might affect your NAS, too

By Mary Branscombe

It’s going to get harder and harder to connect to your NAS as a guest with SMB. That’s a good thing for security, but it could be a problem if your hardware is older.

The Server Message Block (SMB) network file-sharing protocol lets Windows applications read and write files stored on servers in your network, including Network Attached Storage (NAS) systems. SMB underpins a lot of Windows network technologies, such as Storage Spaces Direct and even network printing. The print spooler is essentially just a file, after all.
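
If you’re curious where a particular Windows 11 PC stands, the client-side guest-logon policy can be read from the registry. The short sketch below is an illustration, not something from Mary’s article; it assumes Windows and Python 3 and uses only the standard winreg module to check the AllowInsecureGuestAuth value under the LanmanWorkstation parameters key, the setting that determines whether the machine will still connect to a guest-only NAS share.

    # A minimal sketch, assuming Windows and Python 3 (winreg is in the standard library).
    # Reads the SMB client's AllowInsecureGuestAuth value: 1 means insecure guest
    # logons (what many older NAS boxes rely on) are still allowed on this PC.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters"

    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, "AllowInsecureGuestAuth")
            print("Insecure guest auth:", "allowed" if value == 1 else "blocked")
    except FileNotFoundError:
        # Value not set: the edition's default applies, and recent Windows 11
        # builds block insecure guest logons by default.
        print("AllowInsecureGuestAuth not set; the Windows default applies")

If it reports “blocked” and your NAS offers only guest access, the safer fix is to enable an authenticated account on the NAS rather than re-enabling guest logons on Windows.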

FREEWARE SPOTLIGHT


DeviceRescue — So long, Device Manager

By Deanna McElveen

As frustrating and non-frustrating changes have happened to Windows over the years, I wonder whether Microsoft has simply forgotten about Device Manager.

I doubt that 1998 Deanna (better hair) would find much difference between the Windows Device Manager in Windows 98 and Windows 10 or 11.

The differences are primarily cosmetic, matching the overall changes in the Windows UI design. The functionality has not changed much in 25 years.

SOFTWARE


Ready to retire your servers?

By Susan Bradley

Last week, I urged users to review their software needs and consider options other than sticking with outdated and unsupported operating systems for their personal PCs.

This week, I’m going to discuss what businesses need to think about when it comes to server operating systems. They have a lifecycle that should be reviewed, just like workstations. For example, October represents the end of the road for Server 2012 and 2012 R2. It’s time to look ahead and plan accordingly.


Know anyone who would benefit from this information? Please share!
Forward the email and encourage them to sign up via the online form — our public newsletter is free!


Enjoying the newsletter?

Become a PLUS member and get it all!


Don’t miss any of our great content about Windows, Microsoft, Office, 365, PCs, hardware, software, privacy, security, safety, useful and safe freeware, important news, analysis, and Susan Bradley’s popular and sought-after patch advice.

PLUS, these exclusive benefits:

  • Every article, delivered to your inbox
  • Four bonus issues per year, with original content
  • MS-DEFCON Alerts, delivered to your inbox
  • MS-DEFCON Alerts available via TEXT message
  • Special Plus Alerts, delivered to your inbox
  • Access to the complete archive of nearly two decades of newsletters
  • Identification as a Plus member in our popular forums
  • No ads

We’re supported by donations — choose any amount of $6 or more for a one-year membership.


The AskWoody Newsletters are published by AskWoody Tech LLC, Fresno, CA USA.


Microsoft and Windows are registered trademarks of Microsoft Corporation. AskWoody, AskWoody.com, Windows Secrets Newsletter, WindowsSecrets.com, WinFind, Windows Gizmos, Security Baseline, Perimeter Scan, Wacky Web Week, the Windows Secrets Logo Design (W, S or road, and Star), and the slogan Everything Microsoft Forgot to Mention all are trademarks and service marks of AskWoody Tech LLC. All other marks are the trademarks or service marks of their respective owners.

Copyright ©2023 AskWoody Tech LLC. All rights reserved.