heywood on 10/12/2019 at 23:13
Quote Posted by Sulphur
I think b) is the key - Google already filters my news according to what it thinks I might be interested in, even if it misjudges the tone and tenor of how I prefer it. The immediate future's going to be very much about machines learning our preferences as demographics and automatically funnelling our preferences back at us while soft censoring perspectives we don't care enough to view. We will be (and perhaps are already) stratifying pretty much on how we want our news delivered to us, and our filters will be intelligences we train or think we're training to give us information at our preferred level of bias.
We're already there. We may not realize we're training machine intelligences when we surf the web or use social media, but we are.
Quote:
Perhaps they'll even re-interpret and reproject information in terms we approve of, given enough time and tweaking - so we can feed our own, personal propaganda back to ourselves.
A reporter working for a wire service may put out a story, but few people will read it straight from the primary source. One person will get the information from an infotainer, another from a newspaper, another from Facebook, NPR, Fox News, etc. All of these secondary sources present stories differently and with varying levels of spin. If the machine can find patterns in what sources you turn to for what types and topics of news, it can present you with a curated feed pulled from all the available sources and stories. That's already happening, and it can be somewhat effective (e.g. YouTube). So we're already driving our own personal propaganda.
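The pattern-finding part doesn't even need to be sophisticated. Here's a toy sketch of that kind of preference-weighted curation (all names hypothetical, purely illustrative; a real recommender is vastly more complex, but the basic loop of "count what you clicked, feed you more of it" looks like this):

```python
from collections import defaultdict

def learn_source_weights(click_history):
    """Learn, per topic, how often the user chose each source.

    click_history: list of (topic, source) pairs from past reading.
    Returns {topic: {source: fraction of clicks}}.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for topic, source in click_history:
        counts[topic][source] += 1
    weights = {}
    for topic, sources in counts.items():
        total = sum(sources.values())
        weights[topic] = {s: n / total for s, n in sources.items()}
    return weights

def curate_feed(stories, weights, default=0.0):
    """Rank candidate stories by the learned per-topic source preference.

    stories: list of (headline, topic, source) tuples.
    Unknown topic/source pairs get the default score.
    """
    def score(story):
        _, topic, source = story
        return weights.get(topic, {}).get(source, default)
    return sorted(stories, key=score, reverse=True)
```

Feed it a click history where you mostly read politics from one outlet, and that outlet's take on the next political story floats to the top of your feed. The spin you prefer is exactly what gets reinforced.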
I'm not sure I want to be around when the machines at Google (or wherever) learn to take a "base" story and personalize it with just the right amount of spin to appeal to my biases. If they can individualize the propaganda for maximum effect, that gives them huge power.
lowenz on 11/12/2019 at 00:21
Quote Posted by heywood
I'm not sure I want to be around when the machines at Google (or wherever) learn to take a "base" story and personalize it with just the right amount of spin to appeal to my biases. If they can individualize the propaganda for maximum effect, that gives them huge power.
Inline Image:
https://i.pinimg.com/originals/49/67/d6/4967d6d255abe8deb3b8eae9db0aec73.jpg
icemann on 11/12/2019 at 02:52
I feel like I'm in the Matrix half the time lol.
Renzatic on 11/12/2019 at 02:57
Just half the time? Hell, I woke up in a tub of goo just 15 minutes ago. Dunno how, dunno why, but it happened.
Now, I'm watching Uncle Buck.
lowenz on 11/12/2019 at 08:56
Quote Posted by Renzatic
Just half the time? Hell, I woke up in a tub of goo just 15 minutes ago. Dunno how, dunno why, but it happened.
Now, I'm watching Uncle Buck.
Man, is the kidney still there?
Renzatic on 11/12/2019 at 18:52
Part of my liver's gone, but it's cool. They grow back.
Sulphur on 12/12/2019 at 10:13
Quote Posted by heywood
A reporter working for a wire service may put out a story, but few people will read it straight from the primary source. One person will get the information from an infotainer, another from a newspaper, another from Facebook, NPR, Fox News, etc. All of these secondary sources present stories differently and with varying levels of spin. If the machine can find patterns in what sources you turn to for what types and topics of news, it can present you with a curated feed pulled from all the available sources and stories. That's already happening, and it can be somewhat effective (e.g. YouTube). So we're already driving our own personal propaganda.
We always have been, even before the internet, since we're creatures driven by confirmation bias. I think making it an automatic part of our information delivery mechanisms sets a fairly dangerous precedent, but it's the things we want, or didn't know we wanted, that drive the ad revenue, not the things we don't.
Quote:
I'm not sure I want to be around when the machines at Google (or wherever) learn to take a "base" story and personalize it with just the right amount of spin to appeal to my biases. If they can individualize the propaganda for maximum effect, that gives them huge power.
So I think the future's going to see blowback, or at least a movement to open up AI and neural net infrastructure and have a real-time topographic status map of how it's serving us at a high level - bird's eye, maybe - not, perhaps, from the likes of Google, but from an eventual collective. But beyond that speculation, I think we're also fairly malleable in how we choose to believe what we believe: if the base information is given enough spin at the individual level but still remains recognisably the same message to a particular demographic, small differences in opinion can fall away, because the delivery mechanism - smartphone AI processing, what have you - can still manufacture the appearance of consent if it permeates all our platforms.
Is that a solution to Kolya's challenge? I don't think so, but in a twisted way it makes sense. It certainly seems like everything is converging towards this whether we like it or not.
As for some of the paradoxical problems posed - being recognisable but private - that's one of the things Tim Berners-Lee is working on with Solid and its Pod specification, where end-user data is owned by the end user and not farmed by platforms. Part of the answer lies there, I suspect.
demagogue on 12/12/2019 at 15:07
Quote Posted by Kolya
@dema I think I already addressed the natural checks you describe when I mentioned the "market of ideas", and I don't think trial and error is a useful strategy here. As an example: if I voted for shooting dead every immigrant who ever crossed the Aegean from North Africa to Europe - how would I ever notice that this was a bad idea? Where's the natural check? Nothing will fall apart for me. In fact it might make my life simpler. And yet it is obviously a terrible idea.
Ok, we have a concrete example to think about. That's always good. I'll admit I think about these things like a lawyer, granting it's not the only way to think about things. So if there were actual policy orders coming down the pipe saying "shoot immigrants who cross the border" (we start with where the orders start; crazy old men may say it all the time, but it doesn't mean anything until a gov't official repeats it in actual policy), the first thing that would happen in a country like the USA or BRD (but not, say, China or the old DDR, although you could probably speak to the accuracy of that better than I) involves the people standing around the decider, the staff and handlers. They're the first line of defense. In the case of Trump, there are by now dozens (or more?) of documented examples where Trump gave a patently illegal order and his handlers told him outright that it was illegal. In most cases he still pushes for it, and it's only when they put their foot down and insist that he just can't that he'll relent and complain that this is what he was elected to do, decide his staff is sabotaging him, and get new help. But he usually relents when everyone in the inner circle is saying it's just illegal.
Ok, but say the inner circle fails the check and an illegal order comes down the pipe anyway. In the US, the cases in point here are the private contracts for the border detention camps. There have been reports of, well, let's say suspicious accidents: people dying by "falls", but take away the spin and we're just talking about blunt force trauma to the head when the family gets the body, and if you're a lawyer you do the math and know what probably happened. Much more prevalent are the stories of kids being separated from their parents and not being tended to, getting lice, being all kinds of traumatized, because the contractors don't know the first thing about how to administer a proper holding facility. So it ends up operating like a concentration camp, whatever you want to call it.
Ok, so in a lot of these kinds of cases, you have the border or police officers being handed those orders facing a dilemma. In China, where they've been herding between 1 and 3 million people into re-education detention centers, some of the officers have balked and, at great risk to their lives, leaked documents to the western press, so at least the outside world knows how bad those detentions really are (in terms of the violations, torture, rape, the fact it's just a year-plus prison sentence for people whose only "crime" was having a beard or wearing a veil and being Muslim). Reminds me of Solzhenitsyn getting The Gulag Archipelago published overseas. It didn't have an effect overnight, but it did have an effect that limited what the USSR could do, and I think the Uyghur Leaks coming out of China may have a similar effect. So that's the next line of defense.
What's happening with the US detention camps, though, is that when the work is contracted out to these private groups, they're not trained and there's no transparency... the built-in checks that are supposed to keep violations from happening aren't there. So lo and behold, you get violations. That reminds me of the original deep state cases, the inner military cabals running Turkey in the 1990s and Pakistan in the 2000s. Originally they were formed as a bulwark against extremist Islamic elements in the government, but they weren't transparent. And in both cases, like clockwork, you start seeing massive violations coming out of the underground groups, because impunity sets in where there's no transparency. Same thing with US black sites like Abu Ghraib or Guantanamo Bay. When a security agency is made to hide its activities, even if it's originally for "good reasons", you can bet money that sooner or later there will be torture and sexual violence coming out of that group. It happens so often and so inevitably that it's practically a law of organization theory.
Ok, so what I meant by natural checks are the built-in checks of balking officials, public pressure, legal checks, and the organizational pressure a transparent agency will get when it tries to do something openly illegal. But I tempered that with lots of examples where the natural checks fail and you get violations. That happens when the checks are explicitly removed: when an executive like Trump gradually ousts every handler who isn't a yes-man from his circle, when an agency's work is contracted out to a private security firm or a military cabal that hides its activities, and so on.
I suppose the classic example is how the Nazis completely subverted every check against abuses in the government over the course of the 1930s. That's an interesting case study for the Trump example, because there have been indications that Trump's major model of governance is early national socialism circa 1925 or so.... (Trump long ago admitted that his bedside book is Hitler's early speeches, and some clever journalist did some study into it and claims to have found that Trump has been modeling his own speeches on them. Of course there's also his actual speechwriter and advisor Stephen Miller, who is pretty transparently pushing 1920s-style Nazi rhetoric. I'm pretty sure he's the one that put the 88/14s in Department of Homeland Security documents.) It's still quite a ways away from the genocidal mania that came later, but it's the start of the same slow undermining of public institutions and of faith in an agreed "truth".
I guess my point is, in established liberal democracies there are built-in mechanisms designed to prevent abuses from happening, and under normal circumstances they work more or less well against at least the worst kinds of abuses. But they aren't inevitable, and it's quite possible for a determined set of people to play the long game of undermining and eventually eroding those checks over the course of a decade or two. Naturally those of us who care about freedom and rights don't want that to happen, and there are things we can and should do (cf. the UK election happening right now). But before you can do anything, I think one first needs to understand how these things work in both directions: how checks are put in place and operate, and how they are undermined and eroded. Knowledge is power. I could go on, but this is already a long post and I think you all get my gist.
Starker on 12/12/2019 at 18:31
Quote Posted by demagogue
I suppose the classic example is how the Nazis completely subverted every check against abuses in the government over the course of the 1930s. That's an interesting case study for the Trump example, because there have been indications that Trump's major model of governance is early national socialism circa 1925 or so.... (Trump long ago admitted that his bedside book is Hitler's early speeches, and some clever journalist did some study into it and claims to have found that Trump has been modeling his own speeches on them. Of course there's also his actual speechwriter and advisor Stephen Miller, who is pretty transparently pushing 1920s-style Nazi rhetoric. I'm pretty sure he's the one that put the 88/14s in Department of Homeland Security documents.) It's still quite a ways away from the genocidal mania that came later, but it's the start of the same slow undermining of public institutions and of faith in an agreed "truth".
Yup, this guy lays it out pretty well:
[video=youtube;O8UzmLsXGRU]https://www.youtube.com/watch?v=O8UzmLsXGRU[/video]
Kolya on 12/12/2019 at 23:19
Quote Posted by demagogue
Ok, we have a concrete example to think about.
But the point isn't this concrete example (nor Trump). The point is that as long as the (majority of) voters aren't directly disadvantaged by their decision, those "natural checks" for bad ideas fail:
- Because people prefer short term gratification over long term sustainability
- Because they choose their own minor advantage over someone else's major loss
- Because the sufferer of the decision is a minority
- Because the sufferer is in another country and cannot participate in the vote
- Because people never know what they missed
This is not an argument against democracy - although democracy clearly needs limits, like minority protection - but it is an argument against direct democracy, and against the idea that the best idea will somehow crystallize from a lot of partial interests without any guidance or rules. These rules used to be enforced to a large degree by journalists acting as filters and amplifiers. If we don't want that any more, I think we need something else.