News, tips, advice, support for Windows, Office, PCs & more. Tech help. No bull. We're community supported by donations from our Plus Members, and proud of it

Blog Archives

  • Apple to enhance Siri privacy protection

    Posted on August 29th, 2019 at 16:05 by woody

    From Nathaniel Parker:

    Apple recently issued a statement announcing a series of privacy enhancements to Siri, a follow-up to its halting of the “grading” program in which employees listened to Siri requests.

    After briefly describing how Siri protects customer privacy in its current form and how the “grading” program works, Apple apologized for not fully communicating how the program operated, reiterated that the program is now halted, and announced that it will resume in the fall after a software update (likely in iOS 13 and the other major Apple operating system updates that utilize Siri).

    When the “grading” program resumes in the fall, the following changes will be made, according to Apple’s statement:

    • First, Apple will no longer retain audio recordings to help improve Siri. Apple will, however, continue to use computer-generated transcripts to help improve Siri.
    • Second, Apple will allow customers to opt in to help improve Siri by letting it learn from their audio samples. Those who opt in can opt out at any time, and Apple will apply strong privacy controls to the collected data.
    • Third, when customers do opt in to help improve Siri, only Apple employees (not third-party contractors) will be able to listen to the audio samples. Apple employees will also work to delete audio samples determined to have inadvertently triggered Siri.

    Two points Apple did not specifically include in the statement are:

    • Whether customers can opt in or out of allowing Apple to use computer-generated transcripts to help improve Siri. From the reports I have read on other Apple and tech news sites, it sounds as though Apple will continue to use computer-generated transcripts without offering customers an opt-out (although, per the current iteration of Siri’s privacy protections, the data should be randomized so that it is not tied to a user’s personal information).
    • Whether customers will need to upgrade to iOS 13 (or the other major Apple operating system updates that utilize Siri) to take advantage of the new “grading” program opt-in. I am especially concerned for those on older Apple hardware that cannot run the latest operating system updates, and I wonder whether Apple will address such concerns in minor updates to older Apple operating system releases.

    In general, I trust Apple’s privacy stance with Siri more than I do Apple’s competitors.

    With Apple’s competitors such as Amazon (Alexa), all of my Alexa recordings are stored on Amazon’s servers and tied to my Amazon account (although I can delete any of my recordings at any time).

    It is good, however, that Apple is being forthcoming about the current Siri “grading” program, is addressing the concerns raised, and is making the necessary adjustments. I hope Apple will clarify the two points above, and I look forward to seeing how Apple fully rolls out the new privacy enhancements this fall.

    I respect Apple for working hard to keep privacy at the forefront of the customer experience, and it is another reason I enjoy using Apple’s products and services.

  • Microsoft contractors listen to some Skype calls

    Posted on August 8th, 2019 at 06:15 by woody

    Joseph Cox on Motherboard/Vice has a startling discovery:

    Contractors working for Microsoft are listening to personal conversations of Skype users conducted through the app’s translation service, according to a cache of internal documents, screenshots, and audio recordings obtained by Motherboard. Although Skype’s website says that the company may analyze audio of phone calls that a user wants to translate in order to improve the chat platform’s services, it does not say some of this analysis will be done by humans.

    Like other voice snooping revelations in recent weeks — by both Apple and Google — this fiasco appears to be limited to a very small subset of all translated conversations. And it appears to be covered by a tiny clause in a terms of service agreement. But it still makes me wonder what kind of corporate culture allows this sort of thing to happen — not just at Microsoft.

    UPDATE: That was quick… Lexington Law Group is seeking class action status for a lawsuit against Apple for listening to Siri commands. Thx Bogdan Popa.

  • Apple’s revelations about keeping/scanning Siri recordings demand a response

    Posted on July 29th, 2019 at 10:11 by woody

    Excellent article out this morning from Johnny Evans in Computerworld.

    You may have heard on Friday the Guardian assertion:

    Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant.

    For a company that touts its privacy superiority, that’s clearly way over the line. Even I was shocked – and I’ve been jaded by years of Microsoft’s snooping.

    This morning, Johnny Evans published a clear plan for fixing the wrongs:

    • Apple should introduce much clearer and easier to understand privacy warnings around use of Siri on its devices.
    • When setting up Siri on a new device you as a user should be given the chance to explicitly reject use of your voice for any purpose other than the original request.
    • Apple should bring this [contracted human snooping] work in-house, become completely accountable for what its voice workers and management do with these recordings, and ensure customers have some way in which to punish any infraction of their data privacy.
    • In the event Siri is invoked but no specific request is made, the system should be smart enough to ignore the interaction and delete any recording made as a result of that interaction.
    • Only in those instances in which different voice recognition systems can’t find a way to agree on what is said should human ears be necessary.

    It’s an excellent article. Windows users take note.

  • Cimpanu: The US Govt Accountability Office recommends the US adopt GDPR-like privacy legislation

    Posted on February 17th, 2019 at 07:48 by woody

    It’s about time.

    Catalin Cimpanu, writing for ZDNet, has some great news:

    An independent report authored by a US government auditing agency has recommended that Congress develop internet data privacy legislation to enhance consumer protections, similar to the EU’s General Data Protection Regulation (GDPR).

    The report (PDF) recommends:

    Congress should consider developing comprehensive legislation on Internet privacy that would enhance consumer protections and provide flexibility to address a rapidly evolving Internet environment. Issues that should be considered include what authorities agencies should have in order to oversee Internet privacy, including appropriate rulemaking authority.

    It’s about time. (Although it looks like the report was published on January 15. First I’ve heard of it.)

  • New, improved privacy in Win10 1803 may not be what you think

    Posted on January 31st, 2018 at 05:32 by woody

    I’ve been reading the wave of mainstream articles that followed Marisa Rogers’s publication last week of an official Microsoft notice about new privacy features in the next version of Windows.

    Being the skeptic that I am, the articles sounded to me like Microsoft Press Releases bouncing around the blogosphere — long on accolades, short on real-world experience. Sadly, we’re seeing a whole lot of “reporting” like that these days.

    So it heartens me to see a hard-boiled look at the new feature, from my old friend Preston Gralla. In his Computerworld opinion piece Don’t believe Microsoft’s latest privacy hype, Gralla hit it right on the nose:

    Microsoft got plenty of kudos for the new tool. For the company, that was mission accomplished. But it was anything but that for users. The Diagnostic Data Viewer is a tool that only a programmer could love — or understand. Mere mortals, and even plenty of programmers, will be baffled by it, and they won’t gain the slightest understanding of what data Microsoft gathers about them.

    His conclusion:

    Microsoft should change this. It should release a simple-to-use tool that shows in granular detail and in plain English exactly what diagnostic information is being sent to Microsoft. People should then be allowed to opt in or out for every type of diagnostic information that is sent. And everyone should be able to do that, not just those who have a specific version of Windows 10.

    With the EU apparently poised to do some real privacy protection — I’m not talking about the glossy installation switches in Win10 1703 and later, which are all hat and no cattle — the topic’s going to get heated in the next few months.

    If you want to know the real, nitty-gritty story on Win10 privacy — which settings do what, and how it all fits together — take a look at Martin Brinkmann’s The Complete Windows 10 Privacy Guide: Windows 10 Fall Creators Update version. I have a link to it over on the right side of this page.

    That’s the meat. Don’t settle for the sizzle.

  • Privacy, Apple style

    Posted on September 28th, 2017 at 11:12 by woody

    Apple may be hobbling its advertising revenue by its strict adherence to privacy-friendly policies, but it’s getting a lot of respect at the same time.

    Read about it in Rene Ritchie’s article in iMore:

    Apple, because of its own business model, has no need to persist our data, our behavior, and our relationships on its servers. What’s more, by virtue of the company’s belief in privacy and security, it wants no part of our data. Instead, it encrypts our data end-to-end in transit and purges it as quickly as is practical.

    I’m not saying it’s perfect. I’m saying it sounds better than Google and Microsoft.

  • Google will no longer scan Gmail to serve up personalized ads

    Posted on June 23rd, 2017 at 17:14 by woody

    It’s a remarkable development. Martin Brinkmann at gHacks reports that Google’s giving up on its email scanning.

    Wouldn’t surprise me a bit if EU privacy concerns prompted the move. Whatever the impetus, the result is important. Google stopped scanning email in paid accounts and educational accounts years ago. This is another step in the right direction.

    Far as I know, Microsoft still scans the subject line in free Hotmail/Outlook.com mail accounts.

  • HIPAA compliance using Win10 Enterprise

    Posted on April 14th, 2017 at 06:40 by woody

    Here’s an excellent article about walking the thin line between modern technology and HIPAA (think: keeping private information private in the US — if that isn’t an oxymoron). From HIPAA One, Steven Marco, Arch Bear, and Markus Muller have put together an insightful analysis. From the introduction:

    In today’s computing environment, record-breaking data breaches (e.g. Premera Blue Cross, with 11+ million members breached in 2015) that include healthcare identity theft have increased by over 20% year-over-year between 2012 and 2014. It is no surprise most of us feel we have lost control of our personal data. This is especially true in the healthcare industry in the form of data breaches and HIPAA Privacy violations.

    Simultaneously, massive populations of users are fully-embracing new mobile applications to store and share data across platforms. As a result, cloud computing has bridged the gap between consumer devices and sensitive data. Is there a price to pay for our love affair with cloud-based apps and mobile devices?

    As a cloud-based technology user, have you ever wondered about the safeguards protecting your personal and health information? Ever contemplated how modern operating systems like Google Android, Apple iOS and Microsoft Windows 10 access your data to provide cloud-powered features?

    For example, Siri, the Dragon dictation cloud, Google Voice search and Docs all send voice recordings to the cloud and back while other built-in OS features share contacts between apps. How do these cloud-powered features impact these risks?

    If a medical facility utilizes voice-to-text technology (e.g. by saying “Hey Cortana,” “Siri,” “OK Google,” or “Alexa”) to dictate notes about a patient, that information is automatically exchanged with the cloud. Without a business associate agreement, that medical facility could face a HIPAA violation. How do we securely combine the past 30 years of email use and file and print sharing with today’s cloud-enabled apps?

    These questions and concerns are currently top-of-mind for IT and legal professionals responsible for managing electronic Protected Health Information (ePHI) while ensuring and maintaining HIPAA compliance. In light of the recent focus on HIPAA enforcement actions, hospitals, clinics, healthcare clearinghouses and business associates are trying to understand how to manage modern operating systems with cloud features to meet HIPAA regulatory mandates. Additionally, many of these healthcare organizations are under pressure to broadly embrace the benefits of cloud computing.

    Microsoft has invested heavily in security and privacy technologies to mitigate today’s threats.

    Lounger zero2dash, who posted the original link to this story, says:

    They configured the heck out of 10 AU Enterprise to not phone home, and it did it anyway. Very interesting to see all the settings they tweaked in GP but still saw all the traffic going to MS.

    Having to deal with PCI Compliance is bad enough for me; I’m glad I don’t have to try to keep our environment HIPAA compliant.
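
    For readers curious what “configured the heck out of it in GP” looks like in practice: the central Group Policy setting for Windows 10 telemetry maps to a documented registry value. This is a minimal sketch of that one policy, not zero2dash’s full configuration (which the article doesn’t list), and note that the most restrictive level (0, Security) is honored only on Enterprise, Education, and IoT editions — and, as the HIPAA One analysis found, even then some traffic to Microsoft continues.

    ```
    Windows Registry Editor Version 5.00

    ; Group Policy equivalent:
    ;   Computer Configuration > Administrative Templates > Windows Components
    ;   > Data Collection and Preview Builds > Allow Telemetry
    ; Values: 0 = Security (Enterprise/Education/IoT only), 1 = Basic,
    ;         2 = Enhanced, 3 = Full
    [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
    "AllowTelemetry"=dword:00000000
    ```

    Treat this as a starting point only; a genuinely locked-down environment layers many more Group Policy settings (and, per the article, outbound firewall rules) on top of it.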

    Well worth reading (PDF).