- #2073movie
- #timetravel
- #scifithriller
- #fullmovie
Inspired by Chris Marker's iconic La Jetée, '2073' is a gripping English-language movie that delves into the challenges facing the world. This sci-fi thriller follows a courageous time traveler who risks everything to alter the course of history and secure the future of humanity. Experience a profound journey through time where the fate of mankind hangs in the balance.
sci-fi thriller time-travel future humanity hollywood-movie english-film 2073
#2073Movie #TimeTravel #SciFiThriller #FullMovie
Category: 🦄 Creativity

Transcript
00:00:00Transcription by CastingWords
00:15:29There are trans-
00:15:59Can you tell me, can you tell me why you're not going to change this next thing?
00:16:01It's disrupting democracy.
00:16:06Social media has enabled the rise of these populist authoritarian-style leaders.
00:16:12We will root out the communists, Marxists, fascists, and the radical left thugs that
00:16:19live like vermin within the confines of our country. We will make America great again.
00:16:27So you would say that white supremacist-tied publications meet a rigorous standard for
00:16:38fact-checking?
00:16:41Facebook's site has been used to incite violence against Rohingya refugees in Myanmar.
00:16:47Mr. Modi has driven Muslims of this country to the wall.
00:16:52I have a particular responsibility. It is to end the free movement of people once and for all.
00:16:58Europe is in the heat of a battle about the future of democracy, and Hungary is now the
00:17:03front line.
00:17:05It's not just Trump and Orban. There's an alliance of dictatorships. There's no subtlety
00:17:12in Netanyahu's ambition to claim all of the Palestinian state. There's a global effort by people in
00:17:18power to stay in power. They care more about maintaining power for their clique than a broader
00:17:27sense of common good.
00:17:28Hitler massacred 3 million Jews. There's 3 million drug addicts. I'd be happy to slaughter them.
00:17:38The volume of killing here is quite extraordinary. In the 80 or so days that Duterte has been in power, there's now been an average of somewhere between 35 and 40 police killings of drug pushers and drug users every day.
00:17:52So no qualms about killing killers.
00:18:11Yes, of course. I must admit that I have killed.
00:18:16I'm a journalist. This is my 37th year as a journalist. What's the role of journalism?
00:18:21Journalism, the mission, holding power to account, public and private.
00:18:25As president, you now also defend the Constitution.
00:18:29Yes.
00:18:29You break the law. You threaten to break the law. You said you had killed, a year ago, right? You told me that. And yet, you now have the task of keeping the rule of law. And you said you would do that also. How do you?
00:18:41Because the rule of law, that must prevail.
00:18:43When Duterte took office, our institutions crumbled. He became the most powerful person within six months.
00:18:51Just because you're a journalist, you are not exempted from assassination.
00:18:56We continue doing our reporting.
00:18:58On the worst day of the violence, when murder and looting were taking place all across the city,
00:19:25we saw policemen just standing by, watching what was happening, but doing nothing to try to stop it.
00:19:34When you look back over the last months, you've been the leader of this stage through a very difficult period.
00:19:40Do you think there's anything that you should have done differently?
00:19:43Yes. One area where I was very, very weak, and that was how to handle the media.
00:19:50He doesn't like criticism. He's self-obsessed. He's a megalomaniac. And I say that all the time.
00:19:58My name is Rana Ayyub. I'm a global opinions writer with the Washington Post.
00:20:03I'm an investigative journalist. I write stuff which the government does not like.
00:20:08Mr. Modi is going from strength to strength. He's become a leader of anti-Muslim bigots across the country.
00:20:14I spoke about the fact that here is a man who's an authoritarian, whose only politics is his prejudice of Muslims,
00:20:23whose only politics is persecution of Muslims. And if this man comes to power, he's going to repeat that.
00:20:29And that's exactly what has transpired since Modi came to power in 2014.
00:20:33I believe that I'm standing up for just bearing witness. I don't even want to do anything.
00:20:49I don't want to be anybody's hero. I don't want to be put on a pedestal.
00:20:52I just wanted to speak my truth.
00:20:55They know this referendum is too close to call and are stepping up the intensity with just a week to go till polling day.
00:21:18When ISIS say they will use the migrant crisis to flood the continent with their jihadi terrorists, they probably mean it.
00:21:25There's no good fighting an election campaign on the facts.
00:21:29Now this is a once-in-a-lifetime chance for us to take back control of our immigration system.
00:21:36Because actually it's all about emotion.
00:21:38The majority of people in this country are suffering as a result of our membership of the European Union.
00:21:43These things don't honestly need to be true, as long as people have the belief that they could be.
00:21:47If people feel that voting doesn't change anything, then violence is the next step.
00:21:59What I and other journalists have uncovered is that multiple crimes took place during the referendum.
00:22:06It was the biggest electoral fraud in Britain for a hundred years.
00:22:11I'm a journalist. I kind of became an investigative journalist accidentally, when I was a feature writer.
00:22:16This is what they were posting on Facebook, paid for with illegal cash.
00:22:21How do you hold power to account? That's all I'm trying to figure out.
00:22:25It's like, what are the different mechanisms for holding power to account?
00:22:29Because the old ways don't work anymore.
00:22:31We have no idea who saw what ads, or even who placed the ads, or how much money was spent, or even what nationality they were.
00:22:40When I stumbled into this story, it was really like the scales falling away from my eyes when I understood so much is actually controlled beyond the naked eye.
00:22:50If you're interested, what I'd like to do is set up something, and I'll fund it somehow, that I think, and I think you're the perfect guy,
00:22:59we help knit together this populist nationalist movement throughout the world.
00:23:03Because guys in Egypt are coming to me, the Modi's guys in India, Duterte, you know, and we get Orban, and even think,
00:23:10and we're somehow, some sort of convening authority.
00:23:14It's a global revolt.
00:23:26The game changer was the weaponization of social media.
00:23:32I got 98 messages per hour.
00:23:36That's when I began to realize this is really different.
00:23:39These tech companies enabled a real shift, because exponential lies literally came at us so fast that our human capacity to absorb and fight back was gone.
00:23:58Eight and a half million tweets were directed against me, in which the language is so similar.
00:24:04Jihadi, Burqa clad, ISIS sex slave, you know, terrorist sympathizer.
00:24:10I personally got really targeted.
00:24:13There was this systematic harassment going on online.
00:24:17It became overwhelming.
00:24:20The weaponization of social media was quickly followed by the weaponization of the law.
00:24:25A prominent critic of Philippine president, Rodrigo Duterte, has been arrested.
00:24:36Maria Ressa is CEO and executive editor of the online news site Rappler.
00:24:40Journalist Rana Ayyub, who's been a fierce critic of the government led by Prime Minister Modi,
00:24:45has been named in a charge sheet filed by the Enforcement Directorate over alleged money laundering.
00:24:50Carole Cadwalladr, as a result of your reporting on Brexit, you have been sued by a British businessman.
00:24:57This boys' club reads each other's manuals.
00:25:01Okay, what have we done?
00:25:03Minorities silenced.
00:25:04Media silenced.
00:25:06Dissenters arrested, killed.
00:25:08Social media, we have co-opted them.
00:25:10Information technology, we have used them.
00:25:13They all follow the same playbook.
00:25:14Now it fits.
00:25:16Now it fits how you make journalists and critics and activists enemies of the state.
00:25:21If you don't have facts, you can't have truth.
00:25:24Without truth, you can't have trust.
00:25:26If you don't have any of these three, you don't have a shared reality. You can't solve any problem,
00:25:33let alone existential problems like climate change.
00:25:37You cannot have democracy.
00:25:41Isn't this a science fiction movie?
00:25:56Grandma read newspapers, watched TV news non-stop.
00:26:06She recorded everything.
00:26:09Said she was keeping the receipts.
00:26:11I didn't understand.
00:26:14I was too young.
00:26:17News disappeared.
00:26:18Just like Grandma.
00:26:19Nobody noticed or cared.
00:26:38Name.
00:26:39Where were you born?
00:26:51Where in New San Francisco have you visited?
00:26:54I was so young.
00:26:58There were so many people on the field.
00:27:05We didn't see so much.
00:27:09We didn't see.
00:27:24Every day I pass the same people on my way up.
00:27:33The girl who used to sing is gone.
00:27:40It's like one by one, they're turning into ghosts or grains of sand.
00:27:53There was one place, a secret place my grandma used to take me to.
00:28:24When I was small, I thought this was where people came to pray.
00:28:33It was a place where you could read books.
00:28:38There was special writing too.
00:28:41About things you couldn't see.
00:28:45Where you could meet people and not be afraid they'd rat you out to the authorities.
00:28:53It's where my grandma left me everything she'd saved.
00:29:04All the memories.
00:29:12And all the recordings she'd made.
00:29:17Her receipts from her life.
00:29:19And from a long time before too.
00:29:21I can feel people all around me.
00:29:37My parents are here.
00:29:41And my grandma too.
00:29:43It's our memory they want to wipe out.
00:29:49Language.
00:29:51Culture.
00:29:53And history.
00:29:57And people connecting.
00:30:00People protesting.
00:30:01This is why they came for grandma.
00:30:12They said it's wrong.
00:30:16They said it's a crime.
00:30:21And I know one day they'll come for me too.
00:30:23The event wasn't just one thing.
00:30:35It was a slow creep.
00:30:38Places we used to visit got taken over.
00:30:41Walls got built.
00:30:43Access was denied.
00:30:46And you're desperate.
00:30:48Just surviving.
00:30:49You can't see what's been taken away.
00:30:55You can't fight back.
00:30:57You can't.
00:31:21Let's go.
00:31:51My grandma used to tell me how when she was a girl, she played with other kids in the park.
00:32:21And one by one, they stopped showing up.
00:32:28Some of them got sucked into screen images.
00:32:34Some of them got suicidal.
00:32:40Others vanished.
00:33:20Oh, my God.
00:33:50We are Uyghur people, historically from Central Asia, a Turkic Muslim group like Kazakhs and Uzbeks and Tatars. We are Sunni Muslim.
00:34:06The Chinese government always had this paranoia that the region is not safe, that Uyghurs always have the desire to seek independence.
00:34:20And therefore, it must use every power to make sure it is completely controlled.
00:34:29When you're looking at the World Trade Center, we understand that a plane has crashed into the World Trade Center. We don't know anything more than that.
00:34:37After 9/11, when Bush declared war against terror, the Chinese government conveniently said, we also face the Muslim terrorists, branding Uyghurs as terrorists.
00:34:52This was the escalation to impose strict policies, to make it legal to arbitrarily detain and kill Uyghur people.
00:35:10Then this kind of high-tech surveillance started, using cameras, facial recognition.
00:35:25They set up a working group to examine every person, every family, to check who has travelled abroad, who ever signed any petition, ever protested for anything.
00:35:44They categorised the entire population as normal, suspicious and untrustworthy.
00:35:57They built the biggest reservoir of data ever collected.
00:36:01Every Uyghur aged over 12 must go to the hospital to provide health profile, DNA, fingerprints, iris scan.
00:36:27Facial recognition from various different sides of their face, and also voice recognition.
00:36:40They had to read one paragraph of a newspaper 20 times, establishing bio data.
00:36:50And then the computer can automatically give a result, whether you are normal, you are suspicious, or untrustworthy.
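What the speakers describe is, in effect, a rule-based risk classifier: flagged behaviours are tallied and the total is mapped to one of three labels. As a rough illustration only, here is a minimal Python sketch of that kind of scoring; every field name, weight, and threshold below is invented for the example, not taken from any real system.

```python
# Hypothetical sketch of the three-way categorisation described above.
# The behaviours, weights, and thresholds are invented for illustration.
RISK_WEIGHTS = {
    "travelled_abroad": 2,   # "who has travelled abroad"
    "signed_petition": 2,    # "who ever signed any petition"
    "joined_protest": 3,     # "ever protested for anything"
}

def categorise(profile: dict) -> str:
    """Sum the weights of flagged behaviours and map the total to a label."""
    score = sum(weight for key, weight in RISK_WEIGHTS.items() if profile.get(key))
    if score == 0:
        return "normal"
    if score <= 2:
        return "suspicious"
    return "untrustworthy"

print(categorise({}))                                   # normal
print(categorise({"travelled_abroad": True}))           # suspicious
print(categorise({"signed_petition": True,
                  "joined_protest": True}))             # untrustworthy
```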
00:37:04The Uyghurs have many of the same complaints as the Tibetans.
00:37:11They say they're discriminated against, that they're not allowed to practice their religion and culture freely.
00:37:18And they're afraid to speak in public, especially now.
00:37:22One man said to me, you talk today, and tomorrow your whole family disappears.
00:37:28Human rights campaigners say that over a million people have been interned in concentration camps in the far west province of Xinjiang.
00:37:45It's probably the largest internment of an ethnic or religious minority since the Holocaust.
00:37:50Those who they believe potentially are untrustworthy, not even dangerous, potentially have to go to these so-called re-education camps.
00:38:04They completely re-engineered them, so that Uyghur people were given two choices.
00:38:19You will either become a Han Chinese person, or you die.
00:38:45We're on the precipice of probably the biggest revolution that the world has gone through in modern times.
00:38:53China is the blueprint, because this technology is for sale.
00:39:20Israeli soldiers use facial recognition software to take photos of Palestinians in order to scan their faces into this Wolfpack database.
00:39:31It's the latest component, really, of a system of total surveillance of the Palestinian population.
00:39:37Israel has found a way to control millions and millions of Palestinians through surveillance technology, through weapons, through drones.
00:39:50A myriad of surveillance and repression, and Israel says, if you want to have good relations with us, we'll sell you this amazing technology, so you can also surveil your dissidents and critics.
00:40:09The technology is integrated into warfare, and there are algorithms that help guide military strategy that allow for death and destruction among civilians.
00:40:21You're removing people from the decision-making, so you don't even have any capacity to bring a perspective on whether or not we should drop a 2,000-pound bomb on a refugee camp.
00:40:30A superhero.
00:40:31Yeah, and if you want to step up.
00:40:50The way in which we communicate and live is changing.
00:41:10The possibilities for what people in positions of power can do,
00:41:16and the way that they can intrude into our lives, into our relationships, has all
00:41:22changed, because we have this totalitarian architecture around us. You
00:41:27only need a change of government or change of circumstances before it's used
00:41:31in a totalitarian way, when it's too late.
00:41:46Names of relatives and associates.
00:41:57What groups or organizations do you connect with?
00:42:05Do you have contact with any illegals in America?
00:42:35I believe there will be a clash between those who want freedom, justice, and
00:43:02equality, and those who want systems of exploitation.
00:43:17Do you have a teapot?
00:43:25Do you mind?
00:43:32Water.
00:43:52Water.
00:44:22Greek Mountain Tea.
00:44:27My secret store.
00:44:32I'm surprised it still has any scent left.
00:44:39Perhaps there's still someone out there collecting it from the hills.
00:44:46Perhaps there's still hills out there.
00:44:53Perhaps there's a Greece, a Japan.
00:44:59If they're not both under the sea.
00:45:03There is no better teacher than adversity.
00:45:19Every defeat, every heartbreak, every loss contains its own seed.
00:45:32They killed the guy that said it, but it's still true.
00:45:38That's right.
00:45:42Somebody has to win.
00:46:02She used to be a professor of history.
00:46:17Until AI decided what she taught was obscene and criminal.
00:46:21It's when she came down to live with us.
00:46:25This was the last time I saw her.
00:46:27She vanished soon after.
00:46:30Happens all the time.
00:46:34That's how I know she was a true friend.
00:46:37Someone I could trust.
00:46:42But it's too late now.
00:46:44She's gone.
00:46:49She's gone.
00:46:54Come on.
00:46:58Welcome.
00:46:59Wow.
00:47:00There's a new set of kings that nobody voted for.
00:47:04and they have more money and power than any
00:47:11corporation ever has. Tesla boss Elon Musk is now the richest person in the
00:47:16world with a net worth of 185 billion dollars. Facebook's got 46 billion in cash.
00:48:22The cash flow is somewhere around 10 and a half, 11 billion dollars. Google's parent
00:48:26company Alphabet surged; full-year revenue rose to nearly 75 billion
00:48:31dollars. That now makes Alphabet the world's most valuable company, ahead of
00:48:35Apple. Just want to jump in with some Amazon numbers. Revenue: 89 billion. They are
00:47:41powerful not simply because they have all the money and the wealth but because
00:47:46they've amassed all of the information. Age, gender, ethnicity, religion, what car you
00:47:51drive, what products you purchase in shops, what churches you attend, how you see the
00:47:55world, what actually drives you, how open you are to new experiences, whether you
00:48:00prefer order and habits and planning in your life, how social you are, how much
00:48:04you tend to worry. Big data is an understanding of your personality because
00:48:08its personality that drives behavior and behavior obviously influences how you
00:48:13vote.
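The mechanism sketched in that quote, raw data points rolled up into a personality estimate that then predicts behaviour, can be made concrete with a toy model. Nothing below reflects any real company's pipeline; the attributes, signal weights, and the ad-variant rule are all hypothetical, chosen only to illustrate the data-to-trait-to-behaviour chain.

```python
# Toy psychographic pipeline: observed attributes -> trait estimate -> prediction.
# All signals and weights are hypothetical, for illustration only.

# Step 1: observed data points nudge an "openness to new experiences" estimate.
OPENNESS_SIGNALS = {
    "follows_foreign_news": +0.2,
    "buys_novel_products": +0.1,
    "prefers_fixed_routine": -0.3,
}

def estimate_openness(attributes: set[str]) -> float:
    """Start from a neutral 0.5, apply each observed signal, clamp to [0, 1]."""
    score = 0.5 + sum(OPENNESS_SIGNALS.get(a, 0.0) for a in attributes)
    return max(0.0, min(1.0, score))

# Step 2: the trait estimate drives a behavioural prediction, e.g. which
# message variant a person is deemed most receptive to.
def pick_ad_variant(openness: float) -> str:
    return "change_message" if openness > 0.5 else "stability_message"

profile = {"follows_foreign_news", "prefers_fixed_routine"}
openness = estimate_openness(profile)
print(openness, pick_ad_variant(openness))   # ~0.4 stability_message
```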
00:48:15These technology companies have become behavior modification systems for sale to
00:48:22the highest bidder.
00:48:29We are worth more when we are dead slabs of predictable human behavior than when we
00:48:34are as living, breathing informed citizens of a democracy. We are worth more when we are
00:48:39addicted, outraged, distracted, polarized and disinformed than if we are living, breathing,
00:48:45free humans and citizens. Technology domesticates us into the new kind of human.
00:48:54The last time human beings were commodified, it was the age of industrialization and it was
00:48:59labor. And when labor was commodified, we had sweatshops, we had factory lines, we had child labor, we
00:49:07had exploitation of our physical bodies. Now what's commodified is not our physical bodies, it's our
00:49:17attention.
00:49:22So what are they going to do next?
00:49:28In 2017, a new AI engine got invented. These things are generative, large language,
00:49:43multimodal models. These models treat absolutely everything as language.
00:49:53Everything human beings do runs on top of language. Our laws, the idea of a nation state,
00:50:08friendships and relationships. And just like AI can now translate between human languages,
00:50:14you can translate between almost anything. Images can be treated as language.
00:50:20Sound becomes a language. DNA is just another kind of language. This becomes the total decoding
00:50:27and synthesizing of reality.
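The claim that these models "treat absolutely everything as language" can be made concrete: each modality is reduced to a sequence of integer tokens, and a single sequence model consumes them all. The sketch below is a toy illustration of that framing under my own simplifying assumptions; the tokenisers are stand-ins, not any real model's API.

```python
# Minimal sketch of the "everything is language" idea: any modality becomes
# a sequence of integer tokens, so one sequence model can consume text,
# image patches, or audio frames alike. Toy tokenisers, for illustration only.
import hashlib

def tokenize_text(s: str) -> list[int]:
    # Characters mapped to small integer ids.
    return [ord(c) % 256 for c in s]

def tokenize_bytes(data: bytes, frame: int = 4) -> list[int]:
    # Chunk raw image/audio bytes into frames and hash each to a token id.
    return [
        int.from_bytes(hashlib.md5(data[i:i + frame]).digest()[:2], "big")
        for i in range(0, len(data), frame)
    ]

text_tokens  = tokenize_text("hello")
image_tokens = tokenize_bytes(b"\x00\x10\x20\x30\x40\x50\x60\x70")
audio_tokens = tokenize_bytes(b"\x01\x02\x03\x04\x05\x06\x07\x08")

# Downstream, a single sequence model sees only token ids, whatever the source.
print(text_tokens, image_tokens, audio_tokens)
```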
00:50:32These models are so sophisticated that they can actually autonomously generate anything that
00:50:37you might think is unique to human creativity or human intelligence.
00:50:43It is a massive cybersecurity issue. It's a national security issue. It is a potent threat for
00:50:59geopolitics, election hacking, because anybody's identity can be assumed and appropriated.
00:51:04There are absolutely no guardrails, no laws, nothing that prevents the tech companies from gambling on this
00:51:17without any consequences. The longer nation states do not use their power to regulate artificial
00:51:24intelligence, the more data and power it gets, which usurps the powers of nation states.
00:51:31This is right up front changing our lives.
00:51:43Every worker has a scanner at all times that basically tracks exactly where you're at.
00:51:49What they're doing is they're producing this massive data that they are using to be able to analyze
00:51:54the entire workforce.
00:51:59We're not treated as human beings. We're not even treated as robots. We're treated as part of the data stream.
00:52:09We are now solving problems with machine learning and artificial intelligence that were in the realm of science fiction for the last several decades.
00:52:21There will come a point where no job is needed. The AI will be able to do everything.
00:52:28So how do we build a society where it's not just the owners of all the machines, and they own all the wealth, and then all the rest of us are kind of serfs?
00:52:41The wealthy getting wealthier, the poor getting poorer.
00:52:53The AI will exacerbate that. And that I think will tear society apart because the rich will have just too much.
00:53:00And those who are have-nots will have very little way of digging themselves out of the hole.
00:53:13What we're looking at is a paradigm change in human communication, even human evolution.
00:53:19The key question is, who controls the machines?
00:53:24I am the reason OpenAI exists.
00:53:26Perhaps the most important foundational technology of our time, artificial intelligence.
00:53:31Basically, there's no institution in the world that cannot be improved with machine learning.
00:53:43If we continue giving our money to these technology corporations, these people will have more power and wealth than the entire world.
00:53:53And that's the most alarming thing.
00:53:58Ladies and gentlemen, please welcome on stage, Peter Thiel.
00:54:08From a libertarian perspective, the Western governments are not working that well.
00:54:14There are all these things that, you know, they're not, they're not very competent.
00:54:17Peter Thiel, the most influential venture capitalist in Silicon Valley.
00:54:22As a libertarian, I'm always sort of anti-government, anti-the state.
00:54:27But there's also been an extraordinary decline of competence of the U.S. government.
00:54:32He is ultra-libertarian. I mean, it's like a real bring-down-the-system ethos.
00:54:37Our system only works with growth. If you try to have a zero-growth society, that represents a tremendous break from our past.
00:54:45Billionaires should be allowed to pay less in taxes. Businesses should be allowed increasing freedom, you know, bordering on the freedom to kind of run the world.
00:54:54Corporations are underrated because so many of these other institutions do not work.
00:55:00It's one of the things that's endlessly frustrating to a lot of the very successful people in Silicon Valley.
00:55:06He's not just this singular figure. He's somebody whose ideas have proliferated.
00:55:11Following Thiel is sort of this path through the history of Silicon Valley.
00:55:15It's obviously very dysfunctional to have, you know, mass homelessness. It's very dysfunctional to have no law and order.
00:55:24But if you think of it as a sort of inefficient redistributionist strategy, there's a lot of it that has a certain weird, perverse logic.
00:55:33In San Francisco, where I lived for, you know, 15 years, the homeless people were in the lower parts of town.
00:55:39They didn't climb up the hills. The value of the houses on the hills went up way more than the value of the houses on the flat parts went down.
00:55:47And so you have to think of the homeless people as like this feature to increase the value of the higher-end real estate in the city.
00:55:55Peter Thiel owns this company, Palantir, a Silicon Valley big data company, which has got its claws into the British government and it's infiltrating itself into the NHS.
00:56:13Concordance wants to fix the fragmented healthcare supply chain.
00:56:18This new health service will be organised on a national scale as a public responsibility and so everyone will pay for it and everyone will benefit from it.
00:56:28When you're ill, you won't have to pay for it.
00:56:31What we have in the NHS is an entire nation's medical history from cradle to the grave.
00:56:39That is a vast data set at an individual level and then at a nationwide level.
00:56:46So if you want to do anything with healthcare and big data, it is a massive asset.
00:56:55US tech firm Palantir Technologies has won the biggest IT contract in the history of NHS England.
00:57:02The company have been awarded a £330 million deal to provide AI software to bring together the data of patients.
00:57:09It's really, really important people are transparent with what ultimately is our data.
00:57:14The first funding that Palantir ever got was actually the venture capital arm of the CIA.
00:57:20And up until very recently, all of its customers were the government.
00:57:24This includes the Department of Justice, Department of Defense, Department of Homeland Security, Immigration, Customs Enforcement.
00:57:32The idea that a company like that, a company involved in surveillance, in counter-terrorism, in border enforcement, has any place in the NHS is ridiculous.
00:57:43He's ahead of the curve. And I want to thank him for a very special fact.
00:57:48Peter Thiel is a perfect example of a libertarian tech bro who believes that what we really need is to get away from the reins of the state and just set up some kind of free-for-all system where the billionaire Ubermensch can live unfettered by other people.
00:58:06We have to save this planet, and we shouldn't give up a future for our grandchildren's grandchildren of dynamism and growth.
00:58:13We can have both. Manufactured worlds, rotated to create artificial gravity with centrifugal force.
00:58:20These are very large structures, miles on end, and they hold a million people or more each.
00:58:27High-speed transport, agricultural areas, some of them would be more recreational.
00:58:34You could have a recreational one that keeps zero-G so that you can go flying.
00:58:38These are ideal climates. These are shirt-sleeve environments. People are going to want to live here.
00:58:44Techno-libertarians really hoping for some kind of a techno-monarchy where this group of elite tech bros creates the blockchain and the algorithms that automate our reality under their benevolent programming godlike wisdom.
00:59:02It's survival of the richest because they're preparing for a civilization-threatening catastrophe in their lifetime.
00:59:24The event. The thermonuclear war, the electromagnetic pulse, the climate catastrophe, the pandemic, the economic revolution.
00:59:35Whatever it is that makes life for them unlivable.
00:59:40So they're running around trying to make toys that somehow fix things.
00:59:46Like underground bunkers.
00:59:51And rocket ships.
00:59:59I want to thank every Amazon employee and every Amazon customer because you guys paid for all of this.
01:00:08It's always an ends justifies the means journey toward their better place.
01:00:18And externalizing a tremendous amount of horror onto whoever isn't on board their particular thesis.
01:04:00Could this world and my life have been different if I'd have done something, stood up, fought
01:04:08back?
01:04:09What difference could I make on my own?
01:04:15I don't know.
01:05:24We're running faster and faster through the woods,
01:05:39increasingly blind with less steering and control.
01:05:46Things are going faster and faster and faster.
01:05:48We no longer have a grip on time itself.
01:05:52There are geological natural changes.
01:05:54There are social changes and there are technological changes.
01:05:57And I think mostly, we don't feel that we're in the driving seat.
01:06:01We feel that these forces are driving us.
01:06:08Oh my God!
01:06:24Please, oh my God!
01:06:32Please, please, please, let me out of here.
01:06:37Whoa!
01:06:41Oh my God!
01:06:53We have known for quite some time
01:07:20that climate change acts as a threat multiplier
01:07:23for all dimensions of society.
01:07:26And now we have our world on fire.
01:07:30We're losing lives.
01:07:33We've got infrastructure buckling.
01:07:36We should be accelerating into the energy transition now.
01:07:40We are truly in a climate emergency.
01:07:52It's here and now.
01:07:53It's not in the future.
01:07:59We have faced a lot of bad stuff in the past.
01:08:02Wars and famines and genocides.
01:08:04But this is bigger than any other.
01:08:08If we allow dictators, billionaires and private corporations
01:08:14much more power,
01:08:16they will allow the planetary scale trashing of ecosystems.
01:08:22And we will see the collapse of almost all life on Earth.
01:08:27Our duty as people who care about other people
01:08:30is to oppose those who don't care about anyone
01:08:33except themselves.
01:08:35Because if we do nothing,
01:08:37we face mass extinction.
01:09:00Get out!
01:09:06Ah!
01:12:00It's my turn.
01:12:03Name.
01:12:29Where were you born?
01:12:36Where in New San Francisco have you visited?
01:12:48What groups or organizations do you connect with?
01:13:09What is 2 plus 2?
01:13:31Which books have you read?
01:14:00Do you pray?
01:14:01Would you like to self-confess your crimes?
01:14:21What?
01:14:40I really feel like we're already living in a science fiction world.
01:14:47And in this world, individual will is gone.
01:14:53That's the reality we're already living in.
01:15:28This year, 72% of the world is under authoritarian rule.
01:15:44They crush democratic institutions from within.
01:15:48They don't stay in their countries.
01:15:50They then ally globally, and that's where we begin to see the power shifts.
01:15:56Time is ticking.
01:15:59If we do not act when we can, we will lose free will.
01:16:05That's the tipping point.
01:16:07Are we going to fall off the cliff?
01:16:11Will democracy survive? Will fascism win?
01:16:13Or are we going to make it?
01:22:07Where is your tracking device?
01:22:16Do you have contact with anyone who has been re-educated?
01:22:19What are you going to do?
01:22:26What are you going to do when you are released?
01:22:32What is 2 plus 2?
01:23:02What are you going to do?