All Hail The Algorithm: Read Me Or Just Tap I Agree. Al Jazeera, November 27, 2020, 2:30am-3:01am +03
2:30 am
to help others are among the messages written on the flags, along with a few photographs. But the most common expression of grief is simply "we miss you". This is Al Jazeera, and these are the top stories. Argentinians have been paying tribute to football legend Diego Maradona, who died this week at the age of 60. Thousands of people lined the streets of the capital as his body was transported from the presidential palace to a cemetery. The country is now in the midst of three days of mourning. Our correspondent is following developments for us from Buenos Aires. His family have dictated the terms of this very traumatic day, which many Argentines wanted to share, and I think some of that frustration spilled over into violence earlier in the day, when thousands of people had been
2:31 am
queuing up to get into the presidential palace to pay their last respects. We're told they weren't going to make it in. The family wanted the gates closed; they wanted the body taken to the cemetery, the same cemetery where Diego Maradona's parents are buried. He's being buried now, as we speak, alongside them. But that frustration spilled over in what, they say, has been a very traumatic day for millions of Argentines. Well, AstraZeneca is now considering running another global trial after it admitted its coronavirus vaccine needs more research. Oxford University, which partnered with the pharmaceutical firm, says some of the trial injections didn't have the right amount of vaccine due to a manufacturing error. Germany is imposing stricter lockdown measures as it struggles to suppress coronavirus infections. Chancellor Angela Merkel says the restrictions are likely to continue into January, and she's also pushing for the EU to close down ski resorts over the winter holidays. The Ethiopian prime minister has ordered a final offensive on the northern Tigray region and says
2:32 am
a humanitarian corridor will be opened to help those fleeing the conflict. He has warned residents to stay indoors after ordering the army to move in on the regional capital, Mekelle. This all follows nearly three weeks of fighting, with hundreds killed and tens of thousands displaced. Burkina Faso's president Roch Kaboré has promised to move the country forward after he was re-elected for another five-year term. The electoral commission says Kaboré secured 1.6 million votes of the nearly 3 million cast, with voter turnout at 50 percent. But opposition parties say the vote was marred by fraud. Well, those are the headlines. I will have another news update for you here on Al Jazeera after All Hail the Algorithm. Do stay with us. The American people have finally spoken. America, as I see it: when America is off balance, the world becomes more dangerous. The world is looking at us. In the next year, maybe, with the election behind us,
2:33 am
will the Republican party dump Trump? We take on US politics and society. That's The Bottom Line. There is a huge group of people at work behind our screens. They're called behavior architects, persuasive designers or user experience specialists, and the power they have is massive. That urge to keep swiping through a Twitter feed? That's designed. The way we all click "I agree" to the terms and conditions? That's designed. Swiping left or right on Tinder? That's designed, too. We live in an online world of someone else's making, and most of us never even give it a second thought. And actually, that's design as well. San
2:34 am
Francisco: it's the mecca for tech designers. Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day. It's turbocharged the speed at which we use the internet and made navigating the web more enticing. But it's also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it. As we speed around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying, because that is exactly what they are. They may look like they're there to provide us with control, but the reality is far from it. When users click on "I agree" to the terms and conditions, or they see
2:35 am
a privacy policy and they click on it, they may think that they're actually being given control of their personal data: what's collected, how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this. Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it's probably the only thing we would do; it would have to be our day job, because they're so long and we come into contact with so many. They may have the veneer of giving control to data subjects, but ultimately it's window dressing. Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy. Now, before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up
2:36 am
a lot in discussions of our rights online, but it's easier to explain in the context of medical surgery. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed consent is what everyone says is the ideal. But is it even possible? Consent only works under a very narrow set of conditions, and that's when the decision is infrequent, like with surgery; we don't have surgery all the time. It's when the risks are visceral, when they are things that we can easily conjure up in our minds. And finally, it's when the harm is possibly great. So if things go wrong with surgery, you could get sick or you could die, so we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make decisions quite frequently, ten, a hundred times a day. Harms are not visceral at all;
2:37 am
they are incredibly opaque. And finally, the harm is not even that great, because modern privacy harms aren't huge; they're death by a thousand cuts. The spin from Silicon Valley is that they're on our side when it comes to how we control our data. Long privacy policies are very confusing, and if you make it long and spell out all the details, then you're probably going to reduce the percentage of people who read it. However, take a closer look at the design of the buttons and the pop-ups that we click, and it's clear that the tech companies have the upper hand in the data battle. Design is power, and every single design decision makes a certain reality more or less likely. And what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. So if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them.
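To make that point concrete, here is a minimal sketch of sticky defaults, with invented setting names and an assumed opt-out rate; it illustrates the pattern described above, not any company's actual code.

```typescript
// A minimal sketch of sticky defaults, assuming invented setting names
// and an illustrative 5% opt-out rate. Shipping every switch "on" means
// most users keep sharing everything, simply because defaults stick.

interface PrivacySettings {
  adPersonalization: boolean;
  locationHistory: boolean;
  offSiteTracking: boolean;
}

// Defaults set to maximize data exposure, the pattern criticized above.
const defaults: PrivacySettings = {
  adPersonalization: true,
  locationHistory: true,
  offSiteTracking: true,
};

// Simulate a user base where only a small fraction ever opens settings.
function usersSharingEverything(users: number, optOutRate = 0.05): number {
  let count = 0;
  for (let i = 0; i < users; i++) {
    const s: PrivacySettings = { ...defaults };
    if (Math.random() < optOutRate) s.offSiteTracking = false; // the rare tinkerer
    if (s.adPersonalization && s.locationHistory && s.offSiteTracking) count++;
  }
  return count;
}

console.log(`${usersSharingEverything(10_000)} of 10000 users never changed a default`);
```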
2:38 am
So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you still keep disclosing data, and that they still keep getting what's the lifeblood of their business. Most of us assume that when we go on a website and click "I agree", the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. To start with, it isn't just the website you are on that's mining your information. There are so-called third parties, advertisers and analytics agencies, tracking us using tiny bits of software: beacons, pixel tags. They scoop up incredibly detailed information, everything from the computer you're using to how long you hover over a page. Honestly, it's a bit mind-boggling. And all you really did was click "I agree".
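As a rough illustration of those mechanics, here is a minimal sketch of a third-party tracking pixel; the endpoint and field names are hypothetical, but a one-pixel image request carrying browser details in its URL is the standard technique behind beacons and pixel tags.

```typescript
// A minimal sketch of a tracking pixel. The endpoint (tracker.example)
// and parameters are hypothetical; the mechanism is a 1x1 image request
// whose URL smuggles out browser details, no click or consent UI involved.

function firePixel(eventName: string): void {
  const params = new URLSearchParams({
    event: eventName,
    page: location.href,                        // which page you are on
    referrer: document.referrer,                // where you came from
    screen: `${screen.width}x${screen.height}`, // one fingerprinting input
    lang: navigator.language,
    ts: Date.now().toString(),
  });
  const img = new Image(1, 1); // invisible; the request itself is the point
  img.src = `https://tracker.example/pixel?${params.toString()}`;
}

// Report how long the visitor stayed before leaving the page.
const arrivedAt = Date.now();
window.addEventListener("beforeunload", () => {
  firePixel(`leave_after_${Date.now() - arrivedAt}ms`);
});

firePixel("page_view");
```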
2:39 am
Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say, you know, here's a pop-up banner that tells you about cookies, which no one reads and nobody really even cares about. Yet we push forward this regime as though it matters to people; as though, if someone clicks "I agree", then they're magically OK with all the data processing that's going to come afterwards. Which is a little bit of a joke, and it's a little bit of a legal fiction. Yet it's a major component of almost every data protection framework around the world. Once you've crossed the "I agree" hurdle, you're now in the clutches of the website, and this is where design takes on a whole new level of importance. The job is to keep us hooked. One of the most successful innovations in mobile website design is something called infinite scroll. Many of us
2:40 am
use it every single day, letting us slide through our feeds without even needing to click. I'm on my way now to meet the creator of this function. His name is Aza Raskin. He no longer works inside the tech world; in early 2018 he co-founded the Center for Humane Technology. All of our apps, all of Silicon Valley's companies, are competing for our attention, and because it's such a cutthroat game trying to get our attention, worth, you know, tens of billions of dollars, they have to increasingly point more powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube and you think, I'm going to watch one video, and then somehow you shake your head and an hour has passed. Like, what? Why?
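For the curious, here is a minimal sketch of how infinite scroll works, assuming a hypothetical fetchNextPage feed source; the essence is that new content is appended the moment you near the bottom, so the page never offers a stopping point.

```typescript
// A minimal sketch of infinite scroll. fetchNextPage is a hypothetical
// stand-in for a real feed API; the key design is a sentinel element that
// triggers loading more content, removing any natural stopping point.

async function fetchNextPage(cursor: number): Promise<string[]> {
  // Hypothetical feed source: always has ten more posts ready.
  return Array.from({ length: 10 }, (_, i) => `Post #${cursor + i + 1}`);
}

let cursor = 0;
const feed = document.getElementById("feed")!;
const sentinel = document.getElementById("sentinel")!; // sits below the feed

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return; // sentinel not visible yet
  const posts = await fetchNextPage(cursor);
  cursor += posts.length;
  for (const text of posts) {
    const item = document.createElement("article");
    item.textContent = text;
    feed.appendChild(item); // the feed grows; no "next page" click required
  }
});
observer.observe(sentinel);
```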
2:41 am
How is it that the technology has hypnotized us? This tech hypnotization is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business and our data. The attention economy is just this: it says, if we are not paying for the product, well, the companies have to make money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something. They're trying to make these systems as effective as possible at influencing your decisions. They grab as much information about you as they can, like who your friends are, how you spend your time, perhaps merged with how you spend your money. They take all of this data to build a model of you; imagine, like, a little simulator of you that lives in a Facebook server. And then they can put
2:42 am
things in front of it and ask: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message would resonate with you, this message or this message? And you can see how this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you're going to make. A few years ago, YouTube set a company-wide objective to reach one billion hours of viewing a day. Netflix creator Reed Hastings has also said multiple times that the company's biggest competitor isn't another website, it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed to this at an event.
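As a sketch of what that "little simulator of you" amounts to in code, here is an assumed UserModel with a predictClick score; real systems use large learned models and far more signals, but the selection step, showing whichever message the model predicts you'll engage with, looks essentially like this.

```typescript
// A minimal sketch of the "simulator of you" idea. UserModel and
// predictClick are assumed names: a trained model scores each candidate
// message by predicted click probability, and the system shows the winner.

interface UserModel {
  // Hypothetical: predicted probability (0..1) that this user clicks.
  predictClick(message: string): number;
}

function pickMessage(model: UserModel, candidates: string[]): string {
  let best = candidates[0];
  let bestScore = -Infinity;
  for (const message of candidates) {
    const score = model.predictClick(message); // ask the simulated "you"
    if (score > bestScore) {
      bestScore = score;
      best = message;
    }
  }
  return best; // the message you are statistically most likely to engage with
}

// Usage: a toy model picks among three candidate headlines.
const toyModel: UserModel = { predictClick: (m) => (m.length % 10) / 10 };
console.log(pickMessage(toyModel, ["Headline A", "Headline B!", "Headline C?!"]));
```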
2:43 am
How do we consume as much of your time and conscious attention as possible? That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content. It's a social validation feedback loop. I mean, it's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. Now, it's not as though Silicon Valley pioneered the tricks and tactics of addictive, persuasive design. Many tech designers openly admit using insights from behavioral scientists of the early 20th century. Right at the core is the concept of randomly scheduled rewards, a technique developed by American psychologist B.F. Skinner. In the 1950s, he created what's become known as the Skinner box,
2:44 am
a simple contraption he used to study pigeons and even rats. At the start of the process, a pigeon is given food every time it pecks when the word "peck" appears, or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent, and they take place at random. But by that time the behavior has been established: the pigeon keeps pecking or turning, not knowing when it might get the reward, but in anticipation that a reward could be coming. The boxes were pivotal in demonstrating how design had the power to modify behavior. And if randomly scheduled rewards worked for pigeons, why not humans? Skinner's concept is, in fact, at the heart of a lot of addictive design, from casino machines such as slot machines to social media. Smartphones are unnervingly similar to slot machines. Think about your Facebook, Instagram or Twitter feeds: we all swipe down, pause, and then wait to see what will appear.
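Here is a minimal sketch of that reward schedule, with illustrative numbers: a short training phase of guaranteed rewards, then rare, random ones, which is the variable-ratio pattern Skinner's boxes demonstrated.

```typescript
// A minimal sketch of randomly scheduled (variable-ratio) rewards.
// The numbers are illustrative: train the behavior with guaranteed
// rewards, then keep it going with rare, unpredictable ones.

function makeRewardSchedule(trainingPulls: number, rewardRate: number) {
  let pulls = 0;
  return function pull(): boolean {
    pulls++;
    if (pulls <= trainingPulls) return true; // establish the behavior
    return Math.random() < rewardRate;       // then reward at random
  };
}

// "Pigeon" keeps pecking: five guaranteed rewards, then a 1-in-10 chance.
const peck = makeRewardSchedule(5, 0.1);
for (let i = 1; i <= 20; i++) {
  console.log(`peck ${i}: ${peck() ? "reward!" : "nothing"}`);
}
```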
2:45 am
We're back to those randomly scheduled rewards again. A swipe could result in a new comment on a photo, or a new like, or a piece of spam, or a software update. We don't really know, and it's that unpredictability that makes it so addictive. Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design. Just like on a slot machine, when you're texting or when you're looking through the news feed, you really never know what's coming down the pike. You never know when you're going to sort of hit that jackpot, so to speak, when it's coming and how much it will be. So the randomness is very important to keep you hooked. And I think the fact that we're gambling money in a casino isn't really that different from what we're doing here, because in both cases what we're really gambling is our attention and our time. Right? And across the
2:46 am
board, we're sitting there sort of hoping for some little reward, never knowing when it's going to come. And in all cases we're sort of sitting alone with the machine; there's no natural stopping point. I think that the similarities are quite striking. We check our phones over 150 times a day. It's the first thing we look at when we wake up, the last thing we look at before we go to sleep. It's like we're glued to it, and that's by design. We now have over two billion Skinner boxes in people's pockets. We're running the largest psychological experiment that the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. Super fascinating in the abstract, but sort of terrifying when you think about what it's doing.
2:47 am
The research has been done, and the evidence is clear: digital technology is being designed with the intention to make us addicts. It's not a secret in the tech industry. Two of the biggest tech executives, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand. I think that the reason people reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And you know, while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. And those are, if we're going to use the language of addiction, the ones with a higher event frequency. Think about horse racing, right?
2:48 am
You go to the track, and you've got to really wait for that event to happen. If you are rapidly engaged in an activity, such as a Twitter feed, that is more high-potency: it has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that behavior more. So I think we really can apply the language of addiction to these different media. I have an ongoing frustration, which is that whenever I am still for a second, I have this impulse to reach into my pocket and pull out my phone. And then I get angry at myself, because I say, that's not right, just enjoy this moment, right? Just be with yourself for a second. And then I get angry at myself that my phone has that much power over me, right? And I'm angry that I'm subject to the design of a technology in such
2:49 am
a way that I have difficulty resisting its allure. But of course, everything about these technologies is built to create that impulse, to make it feel as though it's irresistible. There's such emphasis put on free choice, on being able to be a consumer who makes decisions in the marketplace about what you want to do, because you have free will. But at the same time, the very people who are promoting that notion of the person as a consumer sovereign are operating and designing their technology with a very different human subject in mind: somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would have even been creepy to think about someone
2:50 am
designing your behavior, but now it's become accepted that you can be a behavior designer. And behavior design was one part of what Aza used to do in a previous life. He is now one of a growing number of industry insiders taking a more critical stance towards Silicon Valley. Just talking to him made me wonder: does he regret his part in it? I want to be humble about it. If I hadn't invented it, it would have been invented; I just happened to be in the right place at the right time, thinking about the right kind of thing. But yes, I do regret it. I do think it speaks to, like, the naivety of being like, oh, here's just a cool feature, and making it, even if it's great for the user, without thinking about the effects it'll have when you scale it up to a hundred million people or a billion people. Where this little thing that I went around to, like, Twitter and Google and all these other
2:51 am
companies to get them to adopt has now wasted quite literally hundreds of millions of human hours. I'm sure all of us have had someone say to us, stop looking at your phone, or, why are you so addicted to social media? And before I started this series, I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experience is, and how these designs and algorithms are made to hook us. I asked everyone I spoke to: how do we change this? Can we change how online design works? Regulation cannot be expected to happen on its own within these corporations, right, who are profiting from this, because there is just too deep a conflict of interest. And so the only viable kind of regulation is going to have to come from the outside. We have to have a public conversation about what is actually going on in these products. How are
2:52 am
they working? And once we understand that as a collective, what do we want to limit and constrain? If you start with the assumption that people are never going to fully understand the risks of algorithms and the way in which they interact with their data, then we have to think about what else might work in their place. And data protection regimes like the General Data Protection Regulation are a great foundation. One of the ways in which we could really improve upon that is to embrace a trust basis: so instead of putting all of the risk on the user, the requirements of protecting people would fall on the big company that is using the algorithms, that's using the data. I think there is going to be a huge shift from just human-centered design, as we call it in our field, which is like, put the human at the center, which was
2:53 am
a big movement, to thinking about human-protective design. We see that the tools we're building are so powerful that they cause real damage to us individually, to our mental health, to our relationships, to our children, and to society, to democracy and to having civil discourse. And that move to human-protective design, or humane design, I think is super hopeful, because then we can actually have technology which does what it was supposed to do in the first place, which is extend our best selves. So, if you're concerned about the ways in which data is being collected and algorithms are being used to affect your life, there are three things I think you can do. One: use the tools that are given to you. Use privacy dashboards. Use two-factor authentication,
2:54 am
which is really valuable; a sketch of how those one-time codes are generated follows below. Two: be more deliberate and critical about the ways in which companies are asking for your information, in the devices that you adopt and the services you participate in. Understand that companies are trying to tell you things through design, and that they're trying to make things easier or harder. Think about whether you want things to be easier, and what the costs of making things easier are. Understand that companies are trying to get your consent because their entire business depends on it, and think about that as you go forward. And finally, three: I think that design and consent and privacy and algorithms need to be a political issue. So if someone's running for office, ask them what their stance is on algorithmic accountability. Ask them what their stance is on privacy and data collection. Because if we get better rules about how data is collected and algorithms are used, we might all be better off.
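For the curious, here is a minimal sketch of the time-based one-time passwords (TOTP, RFC 6238) behind most two-factor authentication apps; the shared secret below is illustrative only.

```typescript
// A minimal sketch of TOTP (RFC 6238), the scheme most authenticator
// apps use, via Node's built-in crypto. The secret is illustrative only;
// real secrets are random and exchanged once, usually via a QR code.
import { createHmac } from "node:crypto";

function totp(secret: Buffer, now: number = Date.now()): string {
  const counter = Math.floor(now / 1000 / 30); // 30-second time steps
  const msg = Buffer.alloc(8);
  msg.writeBigUInt64BE(BigInt(counter));
  const hmac = createHmac("sha1", secret).update(msg).digest();
  const offset = hmac[hmac.length - 1] & 0x0f;  // dynamic truncation
  const code = (hmac.readUInt32BE(offset) & 0x7fffffff) % 1_000_000;
  return code.toString().padStart(6, "0");      // the familiar 6 digits
}

// Both your phone and the server derive the same code from the shared
// secret and the current time, so a stolen password alone is not enough.
console.log(totp(Buffer.from("illustrative-shared-secret")));
```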
2:55 am
We're at this inflection point where our technology is beginning to make predictions about us that are better than the predictions we make about ourselves. And one of the best ways of inoculating ourselves is by learning about ourselves more. Just, like, stop and really ask yourself before you post something to Facebook or Instagram: why am I doing this? What are my motivations? And sort of slow down your thought process. Often I've found that it'll be like, oh, I am pulling this app out because I'm a little bit bored, or everyone else has pulled out their phones and I'm feeling a little socially awkward. Oh, that's curious. Or maybe, I'm having this experience and I sort of want to show off just a little bit. And just stopping and thinking about what my motivations are for doing things, I have found to be a great inoculation, for spending my time in ways that I wish I had. I'll just say that I recommend having a general attitude shift, where you understand yourself as an organism, as
2:56 am
a creature that can, like any number of other creatures and animals, be turned in certain directions, have your attention swayed, caught, captured, hooked. I find it liberating to recognize all of the forces that are exerting these different powers on me, to turn my attention one way or another. And somehow realizing that helps me to disconnect a lot, because it makes me a little bit angry, and I don't feel so bad about myself; I feel bad about what's being done to me, and then I am more able to disconnect. COVID-19 is a public health crisis that has been compounded by capitalism. Al Jazeera navigates the big questions raised by the global pandemic: how does a system based
2:57 am
on profit, and the pursuit of profit, serve the world in a time of crisis? Capitalism is the pandemic; so much of the suffering comes from exploiting people and protecting profit over people. Episode one of All Hail the Lockdown, on Al Jazeera. As a second wave of COVID-19 brings a surge in infections: a few months ago there were dozens of cases a day; now there are 2,000, and countries are enforcing new measures to curb contagion. This is the first step for the government's aim of mass testing the population. Scientists are on the brink of releasing new vaccines to reduce the spread of the
2:58 am
virus. Will it be enough to bring the global health crisis to an end? The coronavirus pandemic: special coverage on Al Jazeera. Romania's ancient forests are some of Europe's most pristine. They are crucial for our society and crucial for our battle against the climate crisis. But illegal logging by a ruthless timber mafia is destroying both the landscape and people's lives, with those who resist facing violence, amidst claims of corruption and the role of powerful multinationals. People & Power investigates. Romania: Rape of the Forest, on Al Jazeera. Jump into the story and join our global community. Biodiversity is biosecurity; it is essential for our species to survive. Be part of the debate, and you can be part of this conversation, when no topic is off the table. The police are not neutral in all of these cases; the goal here is to terrorize. And
2:59 am
here's the other part of this: there's no consequence. The Stream, on Al Jazeera. St. Louis has been in turmoil... police cars are the target... back in 1991 I was out there, one of the guys with a gun. My brother was killed here in my hood... Michael Brown was killed... they gave my son 15 years, and I felt like, you know, it's just my time to stand up. This is the most important bill for us. I'm just not willing to accept it... that's going to speak to a major need for my community. This bill identifies youth violence as a public health epidemic. Last year we had 200 murders;
3:00 am
the toll of violence, when it comes to the youth, stretches far wider. Saying goodbye to a football legend: thousands in Argentina pay tribute as Diego Maradona is laid to rest. Hello there. This is Al Jazeera, live from Doha. Also coming up: drug company AstraZeneca says it may need to run a new global trial for its coronavirus vaccine, amid questions about its efficacy. And half a million people in the firing line: Ethiopia's government says the final offensive in Tigray has begun.