* Could Google Be Prosecuted For Election Interference > National Security Risk * Nothing Can Stop What Is Coming *
#News #Politics #Trump 47 #Juan O Savin #Nino #Jennifer Mac #Michael Jaco #Education #Republican #USAID #Documentary

Category: News
Transcript
00:00One of these people came out and said, Dr. Epstein, I don't mean to scare you,
00:06but based on what you've told us and the work you're doing,
00:09he said, I suspect that you're going to be killed in some sort of accident in the next few months.
00:16This Harvard scientist got death threats, not for politics, not for fame,
00:20but for exposing something most people aren't ready to hear.
00:23And it gets bigger and bigger, but as it gets bigger, it also gets more scary
00:29because we start to run into pushback, I suppose you could say.
00:37I mean, I've had other warnings, too. I've had other warnings.
00:40He said, and I just want to tell you two things.
00:43He said, number one, you have their attention.
00:47And number two, if I were you, I would take precautions.
00:53Oof.
00:56That's scary.
00:57We've had now six incidents in total.
01:00His name is Dr. Robert Epstein, trained by B.F. Skinner, published in Nature,
01:05respected in every academic circle, until he asked the wrong question.
01:10What if we're not as free as we think?
01:13But what they have done more and more is use very powerful techniques of mind control
01:19to control people's thinking and behavior literally around the world on a massive scale.
01:26And they know exactly what they're doing.
01:28I mean, for years, I was just doing experiments and discovering these techniques
01:32and very skeptical, always very skeptical, and yet then replicating it
01:38or other labs have replicated these same findings.
01:41The point is that over time, I realized this is a really, really serious problem.
01:50And over time, my findings were also revealed by whistleblowers, leaked documents, leaked emails.
02:00What he found? Digital blacklists, DNA tracking, invisible algorithms that could reshape your beliefs entirely.
02:07And the bigger question is, if you're not the one choosing what you believe, then who is?
02:13Let's rewind the clock.
02:14Back in 2015, this Harvard-trained psychologist published something that should have stopped the internet cold.
02:20Epstein called it the search engine manipulation effect, or S-E-M-E.
02:25And what it proved is this.
02:27Simply changing the order of search results can silently shift a person's opinion.
02:31In his experiments, people were shown differently ordered search results for neutral topics.
02:36That's it. No fake headlines, no ads, just links reordered.
02:41But the impact was massive.
02:43Up to 20% shifts in opinion.
02:45In some tests, it rose to 80%.
02:48And nearly 9 out of 10 people never noticed the manipulation had happened at all.
02:53They believed they had formed their own conclusions, entirely unaware that the path they walked down had already been laid out for them.
02:59I was told this literally by Zach Voorhees, whom you may have heard of.
03:02He's one of the most prominent whistleblowers from Google.
03:05They can turn bias on and off like flipping a light switch.
03:10This wasn't some fringe internet hack.
03:12This was a repeatable effect.
03:14It worked in India.
03:15It worked in the U.S.
03:16It worked on people of all ages.
03:19Google didn't need to lie.
03:21It just needed to decide what came first.
03:24And the scary part?
03:25Most people never even noticed it happened.
03:28Because once the belief has shifted, the browser tab closes and the trace is gone.
03:33But what if closing your browser isn't enough anymore?
03:36What if the system isn't just shaping what you see, but quietly rewriting what you are?
03:41Google started out as a search engine.
03:44Yet today, it owns your steps, your sleep, your heartbeat, and, if you let it, your genetic code.
03:51In 2018, Google's sister company Verily launched Project Baseline, a nationwide health study collecting real-time biological and behavioral data.
03:59The goal, obviously, to predict illness before it happens.
04:03But the real goal, that depends on who you ask.
04:06Around the same time, Google also held early investment ties to 23andMe, the popular DNA testing service.
04:12And here's what Dr. Epstein reveals about Google using 23andMe devices to collect your information.
04:17That's one of many methods Google has come up with in recent years for getting people's DNA information into personal profiles.
04:27If you can get DNA info in there, its value goes up and up and up as more discoveries are made about DNA.
04:34And basically, you can figure out what diseases people are prone to that they haven't even gotten yet.
04:42You can also figure out which dads have been cuckolded.
04:46I mean, DNA, think about how much we know about DNA right now, but then think ahead a year, two, five years.
04:56The DNA doesn't change.
04:58So that data, that's solid gold; it goes up in value, like the value of gold itself.
05:05It just keeps going up in value.
05:06Not only this, in 2020, the New York Times reported that Google offered to help manage a national DNA database, voluntarily and at no cost.
05:15But this isn't just about test tubes and lab coats.
05:19Google also owns Nest, the smart thermostat company.
05:22In 2019, Business Insider revealed that several Nest devices included undocumented microphones, without any disclosure or public announcement.
05:31They were always-on hardware in people's homes.
05:34Google called it a mistake.
05:35Critics called it a glimpse into a system that already sees and hears more than you think.
05:39Here's the truth.
05:41We didn't lose our privacy.
05:43We gave it away.
05:45For cheaper gadgets.
05:46For cleaner dashboards.
05:47For one more feature on our wrist.
05:50But now, Google doesn't just know where you go or what you watch.
05:54It knows what your body is made of.
05:56And where your bloodline leads.
05:58You thought you were signing up for convenience.
06:01What you really signed up for was ownership.
06:03They just start out innocently.
06:06And then someone or other realizes that they can use this to change people's thinking and behavior.
06:13Let me tell you this.
06:15Before Google was a household name, before Chrome, before Gmail, before the maps in your pocket, it was just an academic experiment at Stanford.
06:23But even back then, powerful eyes were watching.
06:26I think we have to be fair here.
06:28The intelligence agencies, we're talking the 90s, so there was barely an internet.
06:34And the intelligence agencies, they had meetings about this growing internet thing.
06:40And they were worried about how it could pose a threat to national security.
06:45And one very legitimate concept that they were working with was that if there's more and more information posted out there, and someone wants to build a bomb, that's going to be the first place they go.
06:59They're not going to go to their public library.
07:01They're going to go to this new internet, which at that point was easy to use anonymously.
07:07So, when they were talking with, and to some extent providing funding to, the people building these new search engines, which are basically just big indexes of what's on the growing internet, they were saying: we want to be able to track people who are looking stuff up, especially certain stuff.
07:31In his 2018 book, Surveillance Valley, journalist Yasha Levine exposed how early internet infrastructure was built for tracking instead of browsing.
07:40Levine traced Google's origins back to the Massive Digital Data Systems, or MDDS, initiative, a project backed by government agencies including DARPA, the NSA, and even the CIA.
07:51The goal?
07:52Build tools to collect, analyze, and predict human behavior online.
07:56Google, yes, they started out with help from intelligence agencies.
08:01Yes, they were definitely encouraged to preserve search histories.
08:08One of those tools was Google's now famous algorithm, PageRank.
08:12On the surface, it sorted websites by importance, but at its core, it measured intention.
08:17What people cared about, who they trusted, what they were likely to click next.
08:22It was a calculated prediction engine for desire.
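The core PageRank idea the narrator is describing, a page's importance is the importance flowing in from the pages that link to it, can be sketched in a few lines. This is a hypothetical toy illustration with an invented three-page graph, not Google's actual implementation:

```python
# Toy sketch of the PageRank idea: a page's score is built from the scores
# of the pages linking to it. Hypothetical illustration only.
def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform scores
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Rank flowing into p: each linking page q splits its score
            # evenly among its outgoing links.
            inflow = sum(rank[q] / len(links[q])
                         for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * inflow
        rank = new_rank
    return rank

# Invented three-page web: "b" is the most linked-to page.
toy_web = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}
scores = pagerank(toy_web)
```

Repeating the update until the scores settle yields one number per page, and sorting results by that number is exactly the kind of ranking decision the transcript goes on to discuss.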
08:24The deeper you go into Google's early funding and academic partnerships, the more it starts to look less like a tech company and more like a behavioral weapon.
08:33One designed not just to organize information, but to study, steer, and eventually shape the species that created it.
08:41And somewhere along the way, it stopped being a tool and became the master.
08:46If you now think that's the end of the story, it's not.
08:50We're only halfway down the rabbit hole.
08:52In 2019, a Google insider broke the silence.
08:56Zach Voorhees, a senior engineer at the company, walked out with over 950 pages of internal documents and handed them to Project Veritas.
09:04The files weren't theories.
09:06They weren't drafts.
09:08They were operational systems with names like YouTube blacklists, controversial search queries, fake news watch, and fringe ranking classifiers.
09:17Each of them was designed to do the same thing.
09:20Control visibility.
09:22Quietly.
09:22Without notice.
09:24If a video, article, or creator didn't fit the preferred framework, it wasn't banned outright.
09:29It was just pushed into digital obscurity.
09:32You could still say what you wanted.
09:33You just wouldn't be seen.
09:35When the leak went public, Google didn't deny these systems existed.
09:39They called them part of their quality control infrastructure.
09:43But who decides what qualifies?
09:45What counts as truth?
09:47What gets ranked or removed?
09:49Dr. Epstein reveals it all.
09:51There's a section there on how Google makes those decisions.
09:56And I'm quoting from an internal document at Google.
09:59It's about a 100-page document that's meant to train people to make those decisions about what content to suppress.
10:07So I quote actual language from this internal training manual. Otherwise you're just kind of guessing, but when you actually see what's in the training manual, you don't have to guess anymore.
10:23Because what it says, over and over again, I think 22 times, is: well, when you can't quite make up your mind, just use your best judgment.
10:31It gives tremendous personal discretionary authority to the people making these decisions.
10:39And those people are used to train the algorithms that are now making a lot of those decisions.
10:46And then, for 40 minutes, the masks slipped, the filters fell away, and the world saw what it really means when Google decides to flip the switch.
10:56It happened on January 31st, 2009.
10:59For just over 40 minutes, Google flagged every website on the internet as harmful.
11:04Every single link was marked with a digital stop sign.
11:07Millions of users froze.
11:09Platforms crashed.
11:10The warning?
11:11This site may harm your computer.
11:13But the sites weren't harmful.
11:15It was Google itself throwing the switch.
11:18The company later called it a human error, a rogue update in their safe browsing tool.
11:23But some cybersecurity analysts weren't convinced.
11:26They saw the event as something else entirely.
11:29A fail-safe test.
11:30A controlled experiment to see what would happen if Google suddenly, decisively, pulled the plug.
11:36At that time, stock markets were closed.
11:38The damage was minimal.
11:40But the message couldn't have been clearer.
11:42Google doesn't just connect the world.
11:45It can disconnect it at will.
11:47And the scariest part?
11:48It didn't need to touch your files or your thoughts.
11:52Just one silent update.
11:54And reality as you knew it was offline.
11:57So when a courtroom finally called Google to the stand, you'd think the right questions would follow.
12:02They're still in the remedy phase of two big federal trials.
12:06Yes.
12:06Those might very possibly bring about some big changes in the company, and possibly an enormous loss of revenue.
12:15But they didn't.
12:16Because sometimes a trial isn't about justice.
12:20It's about keeping the real crimes off the record.
12:22These cases are shams.
12:26Oh.
12:26As far as I'm concerned, it is my opinion that these cases were designed by Google's lawyers.
12:34In January 2023, the U.S. Department of Justice filed a high-profile lawsuit against Google.
12:40On paper, it looked serious.
12:42An antitrust case accusing the tech giant of monopolizing the digital ad market.
12:47News anchors debated it.
12:48Headlines called it a reckoning.
12:49But the questions no one asked were the most important ones.
12:54About surveillance, psychological profiling, and mind manipulation.
12:58That part was left untouched again.
13:01Dr. Robert Epstein said it best in a 2023 podcast interview with The Hill.
13:06Even if they're forced to sell Chrome, Google wins.
13:10It gives the illusion of accountability while hiding the real levers.
13:13So what would happen?
13:14You're going to force them to sell off Chrome?
13:16I said, who cares?
13:20I said, if you force them to sell off Chrome, they'll get $500 billion in cash for the sale.
13:28Because they do get paid.
13:30So you force them to sell off Chrome.
13:33I said, and they still get all the data.
13:35He said, how would they get all the data?
13:37I said, well, whoever now owns Chrome is still going to use Google's quarantine list
13:43to check and make sure every website is safe before they take anyone there.
13:47So Google gets all the same information, or most of the same information.
13:53Or Google could pay them.
13:55They could have a backdoor that they didn't even know about.
13:57They could also build in a hundred backdoors.
13:59He's not a politician.
14:00He's not selling a belief system.
14:02What he's trying to preserve is something a lot more fragile, your ability to think for
14:07yourself.
14:09So here's what we're left with.
14:10Not just a search engine, but a silent architect of thought.
14:14A system that watches, nudges, rewrites, and calls it convenience.
14:19Dr. Epstein didn't expose a glitch in the matrix.
14:22He showed us the matrix is real.
14:25And if we don't start questioning what's being shown to us, the next belief they rewrite
14:29might be your own.
14:30Before I testified, a high-ranking executive from Google testified.
14:36And the man, under oath, was asked by, I think, Senator Josh Hawley, does Google have
14:42any blacklists?
14:44And he replied, no, Senator, we do not.
14:49Three weeks later, that is when Voorhees's documents were made public.
14:57And there's a bunch of them that are labeled blacklists.
15:02Now, think of the arrogance.
15:06Think of the arrogance.
15:08If I were running Google and I had a bunch of blacklists, I would not call them blacklists.
15:15I might call them shopping lists.
15:17You know, anything but blacklists.
15:20I'd call them purple lists.
15:22Yeah.
15:23According to him, it isn't about market dominance anymore.
15:26It's about consciousness dominance.
15:28Because if a company owns the way people think, what does it matter who owns the software?
15:33The lawsuits feel loud, but the manipulation remains quiet.
15:38And that's by design.
15:40What the world sees is a regulatory battle over money and ads.
15:44What's actually happening is deeper.
15:47It's about a system preserving its power while the justice system trims a few leaves off the
15:51branches.
15:52And you can't fight what you can't see.
15:55That's why Dr. Epstein built a surveillance system of his own.
15:58Not to monitor people, but to monitor Google.
16:01He calls it the monitoring project where agents simulate real users.
16:05They browse, they search, they scroll, all while the system logs what Google shows them.
16:10No theory, just data.