The tech industry, not surprisingly, discovered that it could put anything in its stated policies and no one would read them. It is not the first to do so -- that's where the term "fine print" came from, probably hundreds of years ago.
The makers of browser extensions know all about this; some of them, moreover, apparently just violate their stated terms and expect to get away with it.
If that's all it takes to get their hands on their customers' data, the choice is easy. You can sell that personal data, after all.
Workers hear drug deals, medical details and people having sex, says whistleblower
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.
Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.
Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.
But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.
Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.
A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.
Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.
The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
Last night I watched The Great Hack (trailer here).
While over the past 10 years I have had plenty of time to be disgusted by the themes of privacy invasion discussed in this documentary, I couldn't help feeling a sense of dread because of the many threats to democracy around the world.
I found online a transcript of the documentary and, to give readers an idea of the impact that this stuff has had, I'm going to quote a passage from it.
I watched it recently as well. While the subject matter is obviously terrifying, I thought the film itself was poor. Rather than being an in-depth documentary, I think it was designed to do little more than shock the uninformed. I say the uninformed because anybody who already knew much on the subject probably learnt little that was new. It was slow, boring, and little more than Brittany Kaiser trying to save face, something she failed to do when it came out that she had also met with both Assange and the Russians, something she never openly revealed. The Carroll and Cadwalladr elements added little as well. If Carroll had actually got his data, and we could see what it was, where it came from, and how it was used, it would have been interesting; but he didn't, so it wasn't. Instead, a large part of the movie was wasted just to find out he never got his data! Where was the discussion of how they used the data, of how this weapons-grade targeting actually works? I would also have liked to see more of Julian Wheatland and Alexander Nix. As Wheatland basically said, this is going on; CA were just the ones that got caught!
Good points - I still think it's worthwhile for the people who might still need to be educated about how social media today invades people's privacy de facto.
There is a parallel with the environment, where I am starting to see a lot more people talking about the problem. Whether we are doing anything useful about it is another matter, but we are talking about it, and that's progress. I think the privacy conversation is a few years behind the environmental one, and this sort of documentary is useful for getting us to talk about it. At least.