Newsletter Archives
-
The right to be sued
ISSUE 20.19 • 2023-05-08 LEGAL BRIEF
By Max Stul Oppenheimer, Esq.
Law students are sometimes puzzled by the section of the Corporations Statute saying that corporations have the right to be sued.
Why, they wonder, would anyone want to be sued? Wouldn’t it be better to have the right not to be sued?
The answer is subtle.
Read the full story in our Plus Newsletter (20.19.0, 2023-05-08).
This story also appears in our public Newsletter.
-
The Three Laws of Robotics
ISSUE 20.15 • 2023-04-10 COMMENTARY
By Will Fastie
Along with its recent announcement of Copilot, Microsoft made a point of mentioning “responsible AI.”
Undoubtedly, part of the reason for bringing the matter up was the almost instant controversy surrounding Bing AI, Microsoft’s integration of its AI engine into Bing and Edge, especially its apparently threatening behavior toward a reporter.
What does “responsible AI” mean?
Read the full story in our Plus Newsletter (20.15.0, 2023-04-10).
This story also appears in our public Newsletter.
-
How to get the most out of Google Bard and Bing Chat
PUBLIC DEFENDER
By Brian Livingston
Whether you like it or not, generative artificial intelligence — gen AI — is making its way into Google’s search engine, Gmail, and Docs. Its Redmond competitor is building OpenAI’s GPT-4 into Microsoft 365, Outlook, Word, Excel, PowerPoint, and more.
Other tech companies don’t want to go down the tubes like buggy whips, slide rules, and Silicon Valley Bank. So they’re all racing against the two software giants to bolt gen AI into their apps and gadgets. Who knows, you might get AI-driven responses from holographic humanoids in Facebook’s failed virtual metaverse.
Read the full story in our Plus Newsletter (20.14.0, 2023-04-03).
-
You’re fired if you don’t know how to use GPT-4
ISSUE 20.13 • 2023-03-27 PUBLIC DEFENDER
By Brian Livingston
Mainstream media outlets are ablaze with news about GPT-4, OpenAI’s enormously powerful artificial-intelligence engine that will soon be shoehorned into every nook and cranny of Microsoft 365. Suddenly, knowing how to “prompt” (program) a generative AI app has become an essential requirement for your job or your life. God help us.
We’ve all been instantly transported into the 25th-century world of Star Trek’s Jean-Luc Picard. You may think we’re still in the year 2023. But now — by entering just a few words — you can propel your personal starship through the galaxy at Warp 9. Or you can remain stuck in place and be assimilated by the Borg. Resistance is futile.
Read the full story in our Plus Newsletter (20.13.0, 2023-03-27).
This story also appears in our public Newsletter.
-
Microsoft 365 Copilot announced
MICROSOFT NEWS
By Will Fastie
Another shoe has dropped in Microsoft’s quest to build generative AI into just about everything.
On March 16, Microsoft announced a new feature named Copilot, which will be provided with Microsoft 365 accounts. Copilot uses generative artificial intelligence, a large language model (LLM), and the Microsoft Graph to provide services based on natural-language prompts. Examples of such prompting are evident in the controversial Bing AI, but Microsoft emphasized its commitment to “responsible AI” for Copilot.
Read the full story in our Plus Newsletter (20.13.0, 2023-03-27).
-
Can chatbots write code? Yes, but should you use it?
PUBLIC DEFENDER
By Brian Livingston
You’ve heard a lot about “chatbots” that can answer search queries and write essays that sound like some human had written them. But did you know that artificial-intelligence agents, if you ask nicely, can also write working computer code?
Bots that can code include OpenAI’s ChatGPT, which Microsoft has tried — with great frustration — to integrate into its Bing search engine. (See my February 27, 2023, column on the freakouts Redmond’s AI has subjected users to.)
Automated tools like ChatGPT have spawned a lot of hysterical commentary, such as “The sky is falling and human programmers are doomed.” But don’t quit your day job just yet.
Read the full story in our Plus Newsletter (20.10.0, 2023-03-06).
-
How to jailbreak ChatGPT and Bing AI — to see their evil twins
PUBLIC DEFENDER
By Brian Livingston
The world has gone gaga for chatbots: text-based artificial intelligence apps like OpenAI’s ChatGPT — which Microsoft is using for its new, gabby Bing AI.
The power of these bots, which converse in a frighteningly human-like way, may be the greatest technology breakthrough since Gutenberg invented movable type, eliminating the tedious hand-copying of manuscripts.
However, that’s like saying the invention of the electric chair was a great advance for criminal justice over the older guillotine technology.
Read the full story in our Plus Newsletter (20.09.0, 2023-02-27).
-
How you can use generative AI images — legally
PUBLIC DEFENDER
By Brian Livingston
Suddenly, image-creation programs based on generative artificial intelligence (gen AI) have exploded into one of the most exciting tech breakthroughs we’ve seen in years.
A big problem is that the developers of these capabilities — and perhaps you, if you use them — are being sued big-time. The charges are copyright infringement and theft of intellectual property from the artists whose works were “scraped” off the Internet to train the systems.
Read the full story in our Plus Newsletter (20.08.0, 2023-02-20).
-
You are rich!
ISSUE 20.07 • 2023-02-13 LEGAL BRIEF
By Max Stul Oppenheimer, Esq.
You may not realize that you have a valuable asset, one that companies are willing to pay big bucks for.
Unfortunately (for you), to date those companies have paid the big bucks to someone else: $76 billion to Google in the last quarter of 2022, $31 billion to Amazon for advertising alone last year. Those companies (and others) are selling access to your personal information. There is nothing illegal about that — you willingly (although perhaps not purposely) provided them the information.
Read the full story in our Plus Newsletter (20.07.0, 2023-02-13).
This story also appears in our public Newsletter.
-
Is the woman in this video real or a deepfake? Now find out.
PUBLIC DEFENDER
By Brian Livingston
There’s been an explosion in artificial intelligence (AI) that can create fake videos and compose passable writing samples. These computer-generated outputs are now good enough to fool the average person, who may absorb social media with an uncritical eye.
The major media have exhaustively (but superficially) written about these AI programs. So I won’t bore you with the mind-numbing details of exactly how they work.
Instead, I’ll tell you how to detect them and — hopefully — protect yourself against fakes of all kinds.
Read the full story in our Plus Newsletter (20.04.0, 2023-01-23).
-
Note to Congress: Please try to keep up
LEGAL BRIEF
By Max Stul Oppenheimer, Esq.
That’s a big ask.
In a previous column, I explained why law always lags technology.
To summarize, case law is by definition reactive. Courts don’t go out and look for cases; they wait for someone to be upset enough to bring one to them. Legislatures can be proactive, but they can’t act until they realize there’s a problem to be acted upon. So it is not surprising that new issues will arise, and we will need to be patient while solutions are agreed upon.
But …
Read the full story in our Plus Newsletter (20.02.0, 2023-01-09).
-
Viva Microsoft!
One day in what seems the distant past, Microsoft Cortana started sending me email messages. I didn’t ask for them; they just started showing up. For the most part, I found them useless. I think there were some “mental health” messages. My wife thinks I’m crazy; I don’t need some server in Redmond telling me that.
One day in what I’m sure is the recent past, the new Microsoft Viva took over that responsibility. This time, however, the focus appears to be on actionable items. Cortana/Viva is watching me, picking out what it thinks are to-do items, and reminding me about them. I’d been a happy camper for years because, while it was apparent that Google was reading my Gmail, Microsoft wasn’t. Now it clearly is.
But, okay, reminders. I handle a lot of email in my editorial duties; some assistance might be helpful. So, I’ve been experimenting.
More often than not, clicking one of the action buttons in the email results in the teeny, tiny error message shown at the top.
Can’t Microsoft code any longer?