Israeli Prime Minister Benjamin Netanyahu confirmed in public comments that his goal for Israel’s next military offensive in Gaza is the takeover and full Israeli occupation of the Palestinian territory. Netanyahu’s plan has faced resistance from the military because heavy Israeli troop casualties are expected and the Israeli captives in Gaza would likely be harmed or killed during the offensive. Israel’s offensive would end quickly if the US stopped supporting it. However, President Trump, when asked about Israel’s full occupation of Gaza, said, “I really can’t say. It is going to be pretty much up to Israel.”
There have been numerous images and reports about starvation and famine in Gaza. When asked, “To what extent are you personally troubled or not troubled by the reports of famine and suffering among the Palestinian population in Gaza?”, 79% of Israeli Jews responded that they were “not so troubled” or “not troubled at all.”
Israeli officials previously announced a plan to build a concentration camp in a tiny area of southern Gaza with the goal of forcing the entire civilian population into it. Netanyahu’s ultimate goal is the removal of the Palestinian people from Gaza, which they now call the Trump plan. No regional countries have stepped forward to take in the Palestinians who are being pushed out.
Jimmy Dore explained that no matter what Hamas does, Israel is pursuing land grabs for ‘Greater Israel’, which has been planned for years. He said that the war was used as a pretext to clear Gaza and take the land.
From The Jewish Independent:
Most Israeli Jews untroubled by reports of Gaza famine, survey finds
A new Israeli public opinion survey has revealed a sharp divide between Jewish and Arab citizens regarding the humanitarian crisis in Gaza, the credibility of the IDF’s reports, settler violence in the West Bank, and concerns over rising antisemitism abroad.
Conducted by the Viterbi Family Centre for Public Opinion and Policy Research at the Israel Democracy Institute, the survey followed increasing reports and images pointing to a severe humanitarian disaster in Gaza, including widespread famine.
When asked, “To what extent are you personally troubled or not troubled by the reports of famine and suffering among the Palestinian population in Gaza?”, 79% of Israeli Jews responded that they were “not so troubled” or “not troubled at all.” In contrast, 86% of Arab Israelis said they were either “very troubled” or “somewhat troubled” by the situation.
These findings align with coverage in the Israeli mainstream media, which for months largely denied or downplayed the scale of hunger in Gaza. However, according to Ruth Margalit in The New Yorker, a shift may be underway. “Even for (Israeli) politicians and journalists who are sympathetic to Netanyahu, it has become permissible to acknowledge that [the hunger crisis] is real,” she wrote. Whether this softening in tone will influence public opinion remains to be seen.
A video trending on social media this week shows a woman breaking down over her student loans:
This isn’t a failure of personal responsibility. This is the designed outcome of a system built to drain your wallet.
After I published my previous essay, readers shared their own moments of realization. Mine came years ago at a car dealership when I tried to pay cash and they looked horrified. I was proud I’d saved enough to pay in full, but that pride turned to confusion when they treated my cash like a problem to be solved. The salesman spent ten minutes trying to convince me to finance at some absurd rate, and I left confused. That’s when I understood – they don’t want transactions anymore, they want relationships. Permanent, extractive relationships.
The woman in the video with her 17% student loans and my confused car dealer reflect the same system, one built to keep us paying forever. Both scenarios reveal the same truth: the economy has been restructured to prefer debt over ownership, subscription over purchase, permanent extraction over finite transactions. She’s paying $1,500 monthly on loans that only grow. She’s not failing the system – the system is rigged against her.
Not that long ago, I genuinely believed fractional ownership could democratize access to assets. Coming from tech, I was naive about who would control these systems and how they’d be weaponized. What I documented in The Boomer Mirage (which showed how ownership was systematically priced out of reach) was just the setup. Today I want to show you the punchline: how the people promising “you’ll own nothing and be happy” engineered a world where they own everything and get rich.
The American Dream wasn’t killed – it was privatized.
The New Paradigm Revealed
The goal was never a secret. It became the defining mantra of the era, famously captured by tech analyst Tom Goodwin in 2015: “Uber, the world’s largest taxi company, owns no vehicles. Facebook, the world’s most popular media owner, creates no content. [Amazon], the most valuable retailer, has no inventory. And Airbnb, the world’s largest accommodation provider, owns no real estate. Something interesting is happening.”
What was ‘interesting’ was a sophisticated money grab disguised as innovation.
The Bait and Switch
It took me years to realize they hadn’t just priced us out – they’d rebranded exclusion as lifestyle choice. Cultural institutions aided the switch. Magazines, TED Talks, and influencers all praised “freedom from stuff.” Suddenly ownership was cast as materialist while minimalism became aspirational.
The pitch was seductive. “What if you could share your city?” Airbnb’s founder asked at a 2016 TED Talk. They sold the idea of belonging and overcoming stranger-danger bias. But the financial model wasn’t about sharing – it was about creating a global platform to monetize spare rooms and, eventually, entire homes, turning community assets into revenue streams for distant shareholders, one 15% service fee at a time.
People were taught to think “Why would I want to be tied down to a mortgage?” without realizing they were choosing permanent rent instead. The pitch felt liberating on the surface, but step back and the timing reveals everything. This wasn’t accidental messaging. The rebranding happened precisely as ownership became mathematically impossible.
The Generational Handoff
When I mapped the timeline from my previous analysis, the coordination became obvious. As rates fell during the Boomer era, many in my parents’ generation built wealth through ownership – buying homes, owning cars outright, financing that decreased over time. Gen X caught the tail end of that system. Millennials and Gen Z were offered ‘access’ instead.
The generational trends are stark. While Boomers participated in systems that built wealth, younger generations largely participate in systems built to extract wealth – renting everything, subscriptions forever, financing that never ends.
A few years ago, I did an audit of my finances and realized I was paying $400 monthly for software I used to own. Adobe Creative Suite, which I’d bought once for $600, was now costing me $240 annually forever. That’s when the pattern became undeniable. This wasn’t market evolution – it was coordinated replacement of an economy they deliberately broke. The same institutions that killed homeownership now profit from the rental economy that replaced it.
The $3,000 Nowhere
My younger cousin makes $65,000 a year – decent money by most standards. He showed me where his money goes each month: $1,800 rent, $600 car lease, $400 in subscriptions, $200 in various app fees. That’s $3,000 monthly going purely to access and subscriptions – with zero assets to show for it.
His grandfather’s $3,000 monthly would have bought a house, built equity, created generational wealth – even adjusting for inflation. His $3,000 disappears into other people’s portfolios every month. This isn’t coincidence – it’s wealth extraction disguised as convenience.
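To make that contrast concrete, here is a minimal back-of-envelope sketch. It is my own illustration, not the author’s figures: the 6% rate and 30-year term are assumptions chosen only to show how the same $3,000 monthly payment either amortizes a loan and builds equity, or buys access and builds nothing.

```python
# Illustrative only: the 6% fixed rate and 30-year term are assumptions, not historical data.

def loan_serviced(monthly_payment, annual_rate, years):
    """Principal a given monthly payment can service (standard annuity formula)."""
    r = annual_rate / 12
    n = years * 12
    return monthly_payment * (1 - (1 + r) ** -n) / r

def equity_built(principal, annual_rate, years, elapsed_years):
    """Principal repaid (equity) after a given number of years of on-time payments."""
    r = annual_rate / 12
    n = years * 12
    k = elapsed_years * 12
    payment = principal * r / (1 - (1 + r) ** -n)
    balance = principal * (1 + r) ** k - payment * ((1 + r) ** k - 1) / r
    return principal - balance

payment, rate, term = 3_000, 0.06, 30
loan = loan_serviced(payment, rate, term)
print(f"${payment:,}/month services a ~${loan:,.0f} mortgage at {rate:.0%} over {term} years")
for yrs in (10, 20, 30):
    print(f"  equity after {yrs} years: ${equity_built(loan, rate, term, yrs):,.0f}")
# The same $3,000/month spent on rent, leases, and subscriptions builds $0 of equity.
```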
When Fractional Ownership Works
I’ve seen fractional ownership work – when the community controls it. Community investment pools where local capital stays local. Cooperative models where members build actual equity stakes. Tool libraries with ownership shares. Community land trusts where members gain wealth while preventing speculation.
I became fascinated with DAOs and liquidity pools in 2020-21 because they seemed to offer genuine community ownership. But governance turned out to be the killer app – who controls the system determines whether it builds wealth for participants or extracts it.
The difference isn’t the technology – it’s who captures the value. These models work because participants gain equity, not just access.
When It Becomes Systematic Extraction
The math is simple and brutal. I tracked Airbnb’s money flow: host gets $100 per night, platform gets ~$15, community loses significant housing stock value. My car research showed leasing versus buying over ten years: $60,000 in payments versus $35,000 purchase with $15,000 residual value. I realized I’d paid Adobe $2,400 over ten years for what used to cost $600 once.
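Tallied over the same ten-year window, using only the round numbers quoted above, the gap looks like this (a sketch, nothing more precise):

```python
# The essay's round numbers, compared over ten years.
YEARS = 10

# Car: leasing vs buying
lease_payments = 60_000                    # total lease payments over ten years
purchase_price = 35_000                    # buy outright
residual_value = 15_000                    # what the owned car is still worth
net_cost_of_owning = purchase_price - residual_value

# Software: subscription vs one-time license
subscription_total = 240 * YEARS           # $240/year, indefinitely
one_time_license = 600                     # the old boxed-copy price

print(f"Car over {YEARS} years:  lease ${lease_payments:,} vs own ${net_cost_of_owning:,} net")
print(f"Software over {YEARS} years: subscribe ${subscription_total:,} vs buy once ${one_time_license:,}")
```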
That car dealership epiphany became my lens. I started seeing the same financing-over-ownership push everywhere – local wealth flowing to distant platform owners. Every industry had flipped the same way. The “sharing economy” didn’t emerge randomly. It launched precisely as ownership became unaffordable. The founders weren’t hiding their extraction model – they were celebrating it.
The vision was laid bare in public filings like WeWork’s. Their mission wasn’t just to rent desks, but to create a “new ecosystem for how we work, live and grow.” They sold “access” to “community” and “inspiring spaces” – all intangible concepts – while capturing hard financial value from long-term leases. It was the perfect model: take on long-term assets, slice them up, and rent them back to a generation that could no longer afford them.
The Rent Is Watching You
But there’s another benefit of this model for those who oversee it: unprecedented data extraction. Rental relationships generate surveillance that ownership never did. Every transaction becomes trackable, every behavior monetizable. Car leases track where you drive, software subscriptions monitor usage, streaming services record preferences.
The pattern is clear from digital surveillance systems – rental often means monitoring. The data extraction isn’t accidental – it’s the business model. Your information becomes another revenue stream while you get poorer. Total visibility is the hidden cost of never owning anything.
The Debt Trap Amplifier
But the problem is deeper than cash flow. It’s about the systemic preclusion from building equity. The psychological weight of this system is crushing – watching your payments build someone else’s equity while you stay trapped. Student loans plus housing costs lock entire generations into permanent renter status.
This isn’t accidental. The debt trap feeds the rental economy perfectly: Can’t buy → must rent → wealth flows up → even less able to buy. It’s a self-reinforcing cycle designed to convert ownership into access, assets into subscriptions.
The Exit Strategy
The system may be rigged, but alternatives exist. Here’s what people can actually do:
Join existing community programs – Community land trusts, cooperative housing projects, local investment pools that keep wealth in the neighborhood.
Start cooperative buying groups – Pool resources with neighbors to purchase tools, equipment, even vehicles collectively with shared ownership stakes.
Investigate equity-building alternatives – Community-supported agriculture with ownership components, local time banks that build relationships and shared value.
Support platform cooperatives – Driver-owned alternatives to Uber, host-owned alternatives to Airbnb, cooperative alternatives to extraction platforms.
These aren’t utopian theories – they’re working models already building real wealth for participants instead of distant shareholders.
The Choice
Understanding the extraction machine is the first step toward starving it. The technology isn’t the problem – who controls it is. The same urgency from my previous analysis applies here: the outcome isn’t predetermined, it’s being decided right now.
Every “sharing economy” innovation should face one question: Who actually gets rich? We can build alternatives or keep enriching the extractors.
They’ve designed a system where they’ll own everything and be rich while you own nothing. But we can design something better.
The changes crept in so gradually that most people didn’t notice. Your freedom to travel now depends on having the right QR code. Your bank monitors your purchases and reports suspicious patterns to government agencies, all under the banner of safety and security. Social media platforms flag your posts as ‘misinformation’ if they question official policies, while your children learn in school that ‘individual rights’ must always be ‘balanced’ against ‘collective responsibility’. When you visit your doctor, everything you say gets entered into databases shared across agencies and institutions you’ve never heard of.
These aren’t separate policies responding to different problems. They’re connected pieces of a single framework that treats you not as a free human being, but as a data point to be monitored, measured, and managed for the stability of a larger system. The framework has a name, a structure, and a timeline that was laid out in remarkable detail nearly eight decades ago.
The 1947 Blueprint
In 1947, Alice Bailey published The Externalisation of the Hierarchy, a book that most readers readily dismissed as fringe esoteric speculation. But Bailey wasn’t making predictions — she was documenting a plan. Writing with the clinical precision of someone with inside knowledge, she described exactly how human civilisation would be reshaped over the following decades. Her book reads less like prophecy and more like a project timeline — complete with phases, methods, target dates, and operational structures.
Bailey laid out a systematic approach to planetary transformation that would unfold over roughly 78 years. The plan wasn’t to destroy existing institutions, but to infiltrate and repurpose them from within, keeping their familiar names and symbols while redirecting their fundamental purpose toward global control. She described the construction of ‘triangular networks’ that would later link government, business, and civil society into unified command systems — precisely the public-private partnerships, multi-stakeholder governance structures, and UN coordination bodies that now dominate global decision-making. Global crises would serve as accelerators, creating the psychological conditions necessary for populations to accept rapid changes that would normally take decades to implement.
Writing with remarkable specificity about the timeline, Bailey stated:
Thus a great and new movement is proceeding and a tremendously increased interplay and interaction is taking place. This will go on until A.D. 2025. During the years intervening between now and then very great changes will be seen taking place, and at the great General Assembly of the Hierarchy—held as usual every century—in 2025 the date in all probability will be set for the first stage of the externalisation of the Hierarchy. The present cycle (from now until that date) is called technically ‘The Stage of the Forerunner’.
Her esoteric terminology masked what was essentially the same systems management architecture that would later emerge through McNamara’s Planning-Programming-Budgeting Systems and evolve into today’s global governance framework — the difference being that she understood it as spiritual hierarchy while technocrats would frame it as systems theory for scientific administration.
The ultimate goal was a planetary management system where unelected experts would make decisions for everyone, justified by appeals to collective good and scientific necessity. Advanced technology, data systems, and psychological techniques would monitor and shape human behavior on a global scale. Bailey wrote that a ‘decisive first stage’ of this transformation would be completed by 2025, marking the moment when this hidden network would stop working behind the scenes and begin openly directing world affairs.
In 2025, 194 nations agreed on the final wording of the core aspects of the WHO Pandemic Agreement, establishing a framework that is expected to give international health officials binding legal authority to override national governments during declared emergencies. Crucially, these ‘emergencies’ are not limited to actual disease outbreaks, but include computer-modeled hypothetical scenarios based on potential pandemic drivers — which, under the One Health framework, encompasses climate change, biodiversity loss, and virtually any environmental condition that algorithms determine might theoretically contribute to future health risks. For the first time in human history, unelected global bureaucrats gained the power to suspend individual rights based on predictive models rather than actual events.
The 78-year timeline was complete, right on schedule.
The Three-Step Transformation
The transformation required a fundamental shift in how human beings understand themselves and their relationship to authority. This shift happened in three overlapping phases, each building on the previous one to create the philosophical and practical foundations for global management.
The first phase involved removing higher truth from human consciousness. As long as people believed in God, natural rights, or moral absolutes, they would resist accepting human authority as final. The solution was a decades-long cultural campaign to convince populations that nothing exists beyond what can be measured and managed by experts. Science was transformed from a method of discovery into the ultimate moral authority, while education systems taught children that ethics were subjective opinions rather than universal truths. Once people stopped believing in transcendent sources of meaning, concepts like ‘human dignity’ became negotiable — defined by whoever controlled the institutional apparatus.
The second phase established official institutions as the only valid source of information about reality. Even without belief in higher truth, people might still think for themselves and reach different conclusions about policy or governance. The solution was to position dissent itself as a form of ignorance or extremism. Questioning official narratives became synonymous with spreading ‘dangerous misinformation’ or ‘endangering our democracy’. Media organisations, technology platforms, and academic institutions coordinated to ensure that populations heard a single, unified story on every major issue. The shift was subtle but decisive: asking questions about policy stopped being called ‘healthy skepticism’ and started being labeled immoral ‘anti-science’.
The third phase deployed the technological and legal infrastructure necessary to enforce compliance without appearing overtly totalitarian. Surveillance systems monitor behavior in real time, algorithms predict and prevent dissent before it can organise, and the eventual social credit systems reward compliance while punishing resistance. Emergency powers bypass normal democratic processes, allowing rapid implementation of restrictions that would be impossible under normal legislative procedures. People become components — cogs in the machine — designed for nothing short of maximum system efficiency.
The Timeline of Implementation
The transformation didn’t happen overnight. It followed a carefully planned sequence that can be traced through public documents, policy changes, and institutional developments over the past six decades.
The foundation was laid between 1961 and 1965 when Defense Secretary Robert McNamara introduced Planning-Programming-Budgeting Systems to the military, then President Johnson expanded this systems-based management approach across the entire federal government. This marked the moment when government stopped being primarily about serving people and started being about managing data flows and optimising systemic outcomes.
The concept of planetary management emerged between 1968 and 1972 through a series of international conferences and agreements. The UNESCO Biosphere Conference established the framework for treating Earth as a managed ecosystem requiring centralised administration. The Club of Rome formed during this period and soon published warnings of planetary collapse without coordinated global control. In a remarkable development, the United States and Soviet Union — supposedly locked in existential conflict — collaborated to create the International Institute for Applied Systems Analysis, which ultimately became responsible for ‘black box’ global modelling that would later evolve into ‘Planetary Boundaries’. This demonstrated that Cold War enemies could unite around planetary management objectives, while the UN Conference on the Human Environment in Stockholm cemented the idea that Earth needed centralised administration to prevent ecological collapse.
The 1980s and 1990s saw the conversion of ethics itself into a tool of global governance. ‘Rights and responsibilities’ frameworks began replacing absolute human rights in international law and academic discourse, with Leonard Swidler positioning these as the middle principles leading to Global Ethics. The Earth Summit embedded ‘sustainable development’ as a moral duty that could override traditional notions of sovereignty, while installing ‘soft law’ controls on carbon emission and sequestration through the UNFCCC and Convention on Biological Diversity. Global business ethics codes aligned corporate behavior with planetary goals rather than local communities or shareholders, an initiative later turbocharged as Enron collapsed in late 2001. Universities established degree programs in ‘global governance’, training the future expert class that would eventually run these systems.
Between 2001 and 2015, the philosophical groundwork was translated into operational policy. WHO ethics papers redefined ‘human dignity’ from an inherent right to something earned through compliance with collective objectives. The ‘One Health’ framework merged human, animal, and environmental governance into a single administrative domain while academic conferences and think tanks normalised the idea that individual rights could be suspended during emergencies for ‘the greater good’. By 2015, international organisations had official ethical frameworks that explicitly authorised overriding personal freedoms when experts determined it was necessary — with no realistic possibility of appeal.
The infrastructure was completed between 2015 and 2019 as surveillance systems, digital identity platforms, and emergency response protocols moved from pilot programs to operational readiness. International emergency protocols were harmonised across countries, though the details remained buried in technical annexes and working group reports that few people read. Everything was in place for activation when the right crisis presented itself.
COVID-19 provided that crisis in 2020, serving as the first global test of the new system. Emergency powers bypassed normal legislative processes, digital health passes demonstrated that populations would accept compliance-based freedoms, and government agencies, media organisations, and technology platforms operated with unprecedented coordination, strategically censoring any divergent point of view. The system worked exactly as designed — at least for a while, though a compliant police force was militarised against the people who objected.
Between 2021 and 2024, ‘temporary’ emergency measures became permanent features of governance. Legislative changes quietly extended emergency powers to cover climate change, artificial intelligence risks, and other global issues. International treaties and public-private partnerships fused health, finance, and environmental control into an integrated global management architecture. The pattern was established: each crisis expands the system’s reach, with climate emergencies, AI safety threats, and cybersecurity incidents already positioned as the next triggers for expanded global coordination.
The WHO Pandemic Treaty signed in 2025 represents the completion of this 78-year process. At present trajectory, international officials will eventually come to possess binding authority over national governments during declared emergencies — exactly as Bailey had outlined. The ‘externalisation’ is complete — global governance operates openly rather than behind the scenes.
The Choice Before Us
Understanding this history clarifies the choice we face. We are not heading toward this system of global management — we are already living within it, though it hasn’t yet had time to fully set. The question is whether we will accept it as inevitable and beneficial, or whether we will work to restore governance based on democratic accountability while we still can.
Accepting the system means embracing a future where rights depend on compliance scores, where algorithms make decisions once reserved for human judgment, and where global bureaucrats can override local representatives whenever they declare an emergency that cannot be challenged. It means raising children who understand freedom as permission granted by authorities rather than an inherent birthright.
Rejecting the system requires rebuilding institutions based on different principles — transparency in emergency powers, genuine democratic consent for international agreements, full transparency and genuine accountability for public officials committing crimes, and recognition that human dignity cannot be conditional on compliance with expert recommendations. It means supporting alternatives that prioritise humanity over system efficiency, and teaching the next generation that rights do not derive from being well-behaved.
This is not a partisan political issue. People across the traditional political spectrum should recognise the difference between governance that serves the people and management that treats people as data points to be optimised. The system transcends conventional politics because it operates at the level of fundamental assumptions about human nature and the proper relationship between individuals and institutions.
The transformation succeeded because it happened gradually, then suddenly. For decades, each change seemed reasonable in isolation. But the cumulative effect has been to create a system where human agency is increasingly replaced by algorithmic authority, where local control gives way to global management, and where individual rights become conditional privileges.
The people who designed this transformation understood that change happens through accumulated precedents rather than dramatic reversals. They also understood that systems depend on participation. The global management apparatus requires local compliance to function effectively. This creates opportunities for resistance that don’t depend on controlling national governments or international organisations.
Every individual choice to resist redefinitions of basic concepts like freedom and dignity contributes to a larger cultural shift. Supporting businesses and organisations that operate according to human-centered rather than data-centered principles creates alternative networks. Engaging in local governance where human relationships still matter more than algorithmic optimisation builds foundations for different kinds of institutions.
The next crisis will undoubtedly be used to expand control further, just as previous crises have been. But understanding the pattern makes it possible to resist the psychological manipulation that accompanies emergency declarations. Knowing your rights before they’re suspended ‘temporarily’ — even if this is promised to be for only ‘two weeks’ — creates space for a response rather than mere reaction.
The 78-year plan succeeded because most people didn’t know it existed.
Now that it’s visible, the choice is ours: participate in our own management, or remember what it means to govern ourselves.
Same Self Replicating Nanotechnology Spheres Seen In C19 Unvaccinated Living Blood As In Deceased Embalmed C19 Vaccinated Blood With Rubbery Clots – What Will Humanity Do About This?
Image: Deceased blood with self replicating nanotechnology
I have shown my microscopy of embalmed blood from an individual who had died 8 months earlier, with the long rubbery clots that everyone should be familiar with by now. They were featured in the documentary “Died Suddenly”.
The spheres that I filmed continue to replicate – in the above image you can see how the spheres continue to develop. In the video below, I looked at the spheres at 4000x magnification. Light-emitting micro robots are clearly visible. Judging by the size of the surrounding red blood cells, they are estimated to be several hundred nanometers in size.
I had shown how the spheres contain nano and micro robots that are emitting light in different frequencies and colors, as Quantum Dot polymer coated bidirectional biosensors would do – and have previously photographed them in C19 unvaccinated blood. I have also shown how these spheres construct the filaments we see in the blood.
Image: C19 unvaccinated blood sphere filled with Nano technology
Image: C19 unvaccinated blood sphere filled with Nano technology building hydrogel filaments
I have a lot of conversations with people. Still, many are in denial. Doctors do not believe in nanotechnology and deny its existence. Supposedly people are still not ready to hear about nanotechnology that has self-disseminated via shedding worldwide and is now causing the rubbery clots. We have proven that. They are made of a rubber-like material, a polymer that has self-assembled.
Image: Rubbery Clot Development In C19 Unvaccinated Individual With Previous Deep Vein Thrombosis and Massive Pulmonary Emboli – While On Eliquis, Nattokinase, Lumbrokinase and Serrapeptase
My question is when will the denying doctors, scientists, politicians, attorneys and media decide people will be ready to know about this and start discussing it? When they are dead?
This is a live blood analysis of an individual who after air travel started having upper respiratory tract symptoms suggestive of “Covid”. I have in previous posts shown that acute Covid symptoms correlated with significant replication of the hydrogel filaments and excessive rouleaux formation. Therapeutic dose Ivermectin quickly resolved the symptoms. In live blood analysis one can see that Ivermectin helps resolve the rouleaux, but does not diminish micro bots or hydrogel production. The individual now had a follow up live blood analysis a week later.
I filmed how the blood was being transformed by the same spheres filled with nanotechnology. The blood was loaded with these spheres. You can see the movement and the optical light emission. Red blood cells surrounding this are in severe oxidative stress – they are dying. This is the same magnification as the deceased C19 vaccinated blood above – Oil Objective 4000x. While the nano robots are tiny, you can see them emitting light and moving.
Here is another view:
Here is the same blood with many of these small spheres, which are between 5 and 10 microns in size but can become much larger. The blood is transformed into a polymer mesh network.
Image: C19 unvaccinated blood 200x magnification
These spheres in the blood are not air bubbles. They are hydrogel nanotechnology construction sites. You can see in the video below many spheres that are perfectly round interconnecting and extracting the life out of the red blood cells.
The blood cells, which clearly show oxidative stress, are being transformed under the coordinated effort of micro robots. These can be recognized by their blinking lights coordinating smaller nano robots. Considering that one red blood cell is about 5-7 micrometers, some of the very small swarming nano bots are estimated to be around 500 nanometers. Watch the blinking lights; those are robots.
Summary:
I talk to people, give interviews, work on legal strategies as much as I can, and give my time freely in the hope that the world will become aware of this threat. The response after all these many months of writing hundreds of substacks is still remarkably apathetic. I am seriously asking the question below of those who do not want to bring this knowledge to the forefront.
It is addressed to all the colleagues, organizations and legal representatives who are presumably fighting on the same side of history as I am in the “freedom movement” – and whose antidotes I look at under the microscope and find have no effect on the nanotechnology:
With all consideration for the future of humanity in mind, when do you think people are ready to hear about this?
The answer – when they are dead – is a bit late in my opinion.
I invite all to take another look at the evidence and consider the ramifications of these findings.
The White House has just released its official policy document, America’s AI Action Plan, defining the future of AI development. Admittedly, Trump doesn’t have any real understanding of AI, but he has totally caved in to the Technocrats he appointed in the first place. Indeed, Technocracy is being forced down our throats whether we want it or not.
The first pillar of America’s AI Action Plan focuses on removing regulatory barriers and eliminating unnecessary review processes. Superficially, this appears as a push against bureaucratic inertia, but in reality, it amounts to an explicit transfer of authority from elected bodies to expert committees and interagency working groups.
The second pillar includes a comprehensive scheme for AI literacy and workforce retraining. At first glance, investment in skill development and rapid-response training may appear benevolent. Yet the Plan prescribes a narrowly defined set of competencies—data labeling, model auditing, grid operations—determined by federal projections of industrial demand. Such top-down workforce engineering tracks precisely with technocratic ideology, which regards citizens as variables in an optimization problem. Rather than empowering individuals to shape their own vocational paths, the Plan channels labor into predetermined slots within a digital economy overseen by experts.
The third pillar of the report extends the domestic technocratic agenda to the world. By exporting American AI frameworks, hardware standards, and regulatory templates to allies, the Plan seeks to cement a global regime of expert rule.
The last item on the last page of the Plan contains real paydirt for Technocracy and Transhumanism:
AI will unlock nearly limitless potential in biology: cures for new diseases, novel industrial use cases, and more. At the same time, it could create new pathways for malicious actors to synthesize harmful pathogens and other biomolecules. The solution to this problem is a multi-tiered approach designed to screen for malicious actors, along with new tools and infrastructure for more effective screening. [Remember nose swabs for COVID screening? – Ed.] As these tools, policies, and enforcement mechanisms mature, it will be essential to work with allies and partners to ensure international adoption.
Recommended Policy Actions
Require all institutions receiving Federal funding for scientific research to use nucleic acid synthesis tools and synthesis providers that have robust nucleic acid sequence screening and customer verification procedures. Create enforcement mechanisms for this requirement rather than relying on voluntary attestation.
Led by OSTP, convene government and industry actors to develop a mechanism to facilitate data sharing between nucleic acid synthesis providers to screen for potentially fraudulent or malicious customers.
Build, maintain, and update as necessary national security-related AI evaluations through collaboration between CAISI at DOC, national security agencies, and relevant research institutions.
Therefore, DNA screening will become commonplace across government agencies.
Who Wrote This Technocratic Screed, Anyway?
Not surprisingly, the report’s lead authors are listed as Michael Kratsios and David Sacks, with Secretary of State Marco Rubio included as an official with clout.
Michael Kratsios, Technocrat
Currently, Kratsios is listed as Assistant to the President for Science and Technology. In the first Trump Administration, he served as the Chief Technology Officer (CTO). Appointed in August 2019 at age 33, he was the youngest person ever to hold the federal CTO position.
In this role, he led the White House Office of Science and Technology Policy’s efforts to advance emerging technologies—most notably artificial intelligence, 5G wireless networks, quantum computing, and data privacy—across the federal government. He coordinated interagency AI initiatives, helped develop the American AI Initiative, and convened industry, academic, and civil-society stakeholders to guide national technology policy.
David O. Sacks, Technocrat
Sacks is listed as Special Advisor for AI and Crypto. He was a co-founder and the first Chief Operating Officer (COO) of PayPal, alongside Peter Thiel and Elon Musk. As such he was a prominent member of the so-called “PayPal Mafia.” He is heavily invested in the AI industry through his company, Craft Ventures.
Sacks’ authority is questionable. He was originally listed as a “Special Advisor to the President” under a protocol that ran for 133 days, which has long expired. On this report, his title has changed to “Special Advisor for AI and Crypto.” I conducted an exhaustive search to determine that David Sacks has no current position with any government entity and is, therefore, a private citizen. So, what is his name doing on this report?
Apparently, Sacks is self-appointed to be the “Crypto and AI Czar”. Yes, self-appointed. Today’s arch-Technocrats are so sure of themselves that they don’t need official appointment to assert themselves.
According to our conventional view of history, humans have only walked the Earth in our present form for some 200,000 years. Much of the mechanical ingenuity we know of in modern times began to develop only a couple hundred years ago, during the Industrial Revolution. However, evidence today alludes to advanced civilizations existing as long as several thousand years ago—or possibly even earlier.
“Oopart”—or “out-of-place artifact”—is the term given to numerous prehistoric objects found in various places across the world today that show a level of technological sophistication incongruous with our present paradigm.
Many scientists attempt to explain these ooparts away as natural phenomena. Yet others say that such dismissive explanations only whitewash over the mounting evidence: that prehistoric civilizations had advanced knowledge, and this knowledge was lost over the ages only to be developed anew in modern times.
We will look at a variety of ooparts here, ranging from millions to hundreds of years old in purported age, but all supposedly demonstrating advancement well beyond their time.
Whether these are fact or merely fiction we cannot say. We can only offer a glimpse at what’s known, supposed, or hypothesized regarding these phenomena, in the spirit of being open-minded and geared toward real scientific discovery.
17. 2,000-Year-Old Batteries?
Clay jars with asphalt stoppers and iron rods made some 2,000 years ago have been proven capable of generating more than a volt of electricity. These ancient “batteries” were found by German archaeologist Wilhelm Konig in 1938, just outside of Baghdad, Iraq.
Right: An illustration of a Baghdad battery from museum artifact pictures. (Ironie/Wikimedia Commons) Background: Map of area surrounding present-day Baghdad, Iraq. Cmcderm1/iStock/Thinkstock
“The batteries have always attracted interest as curios,” Dr. Paul Craddock, a metallurgy expert at the British Museum, told the BBC in 2003. “They are a one-off. As far as we know, nobody else has found anything like these. They are odd things; they are one of life’s enigmas.”
16. Ancient Egyptian Light Bulb?
A relief beneath the Temple of Hathor at Dendera, Egypt, depicts figures standing around a large light-bulb-like object. Erich von Däniken, who wrote “Chariots of the Gods,” created a model of the bulb which works when connected to a power source, emitting an eerie, purplish light.
The light-bulb-like object engraved in a crypt under the Temple of Hathor in Egypt. Lasse Jensen/CC BY 2.5
15. Great Wall of Texas
In 1852, in what is now known as Rockwall County, Texas, farmers digging a well discovered what appeared to be an ancient rock wall. Estimated to be some 200,000 to 400,000 years old, some say it’s a natural formation while others say it’s clearly man-made.
A historic photo of the “wall” found in Rockwall, Texas. Public Domain
Dr. John Geissman at the University of Texas in Dallas tested the rocks as part of a History Channel documentary. He found they were all magnetized the same way, suggesting they formed where they are and were not moved to that site from elsewhere. But some remain unconvinced by this single TV-show test and call for further studies.
Geologist James Shelton and Harvard-trained architect John Lindsey have noted elements that seem to be of architectural design, including archways, linteled portals, and square openings that resemble windows.
14. 1.8-Billion-Year-Old Nuclear Reactor?
In 1972, a French factory imported uranium ore from Oklo, in Africa’s Gabon Republic. The uranium had already been extracted. They found the site of origin to have apparently functioned as a large-scale nuclear reactor that came into being 1.8 billion years ago and was in operation for some 500,000 years.
Nuclear reactor site, Oklo, Gabon Republic. NASA
Dr. Glenn T. Seaborg, former head of the United States Atomic Energy Commission and Nobel Prize winner for his work in the synthesis of heavy elements, believed it wasn’t a natural phenomenon, and thus must be a man-made nuclear reactor.
For uranium to “burn” in a reaction, very precise conditions are needed. The water must be extremely pure, for one—much purer than exists naturally. The material U-235 is necessary for nuclear fission to occur. It is one of the isotopes found naturally in uranium. Several specialists in reactor engineering have said they believe the uranium in Oklo could not have been rich enough in U-235 for a reaction to take place naturally.
13. Sea-Faring Map Makers Before Antarctica Was Covered in Ice?
A map created by Turkish admiral and cartographer Piri Reis in 1513, but sourced from various earlier maps, is thought by some to depict Antarctica as it was in a very remote age before it was covered with ice.
A portion of the Piri Reis map of 1513. Public Domain
A landmass is shown to jut out from the southern coastline of South America. Captain Lorenzo W. Burroughs, a U.S. Air Force captain in the cartographic section, wrote a letter to Dr. Charles Hapgood in 1961 saying that this landmass seems to accurately show Antarctica’s coast as it is under the ice.
Dr. Hapgood (1904–1982) was one of the first to publicly suggest that the Piri Reis map depicts Antarctica during a prehistoric time. He was a Harvard-educated historian whose theories about geological shifts earned the admiration of Albert Einstein. He hypothesized that the land masses shifted, explaining why Antarctica is shown as connected to South America.
Modern studies refute Hapgood’s theory that such a shift could have taken place within thousands of years, but they show it could have happened within millions of years.
12. 2,000-Year-Old Earthquake Detector
In 132 A.D., Zhang Heng created the world’s first seismoscope. How exactly it works remains a mystery, but replicas have worked with a precision comparable to modern instruments.
A replica of an ancient Chinese seismoscope from the Eastern Han Dynasty (25-220 A.D.), and its inventor, Zhang Heng. Wikimedia Commons
In 138 A.D., it correctly indicated that an earthquake had occurred about 300 miles west of Luoyang, the capital city. No one in Luoyang had felt the quake, and the warning was dismissed until a messenger arrived days later requesting aid.
11. 150,000-Year-Old Pipes?
Caves near Mount Baigong in China contain pipes leading to a nearby lake. They were dated by the Beijing Institute of Geology to about 150,000 years ago, according to Brian Dunning of Skeptoid.com.
A file photo of a pipe, and a view of Qinghai Lake in China, near which mysterious iron pipes were found. NASA; Pipe image via Zhax/Shutterstock
State-run media Xinhua reported that the pipes were analyzed at a local smeltery and 8 percent of the material could not be identified. Zheng Jiandong, a geology research fellow from the China Earthquake Administration, told state-run newspaper People’s Daily, in 2007, that some of the pipes were found to be highly radioactive.
Zheng said iron-rich magma may have risen from deep in the Earth, bringing the iron into fissures where it may have solidified into tubes, though he admitted, “There is indeed something mysterious about these pipes.” He cited the radioactivity as an example of the strange qualities of the pipes.
10. Antikythera Mechanism
A mechanism often referred to as an ancient “computer,” which was built by Greeks around 150 B.C., was able to calculate astronomical changes with great precision.
The Antikythera Mechanism is a 2000-year-old mechanical device used to calculate the positions of the sun, moon, planets, and even the dates of the ancient Olympic Games. Marsyas/CC by SA 3.0
“If it hadn’t been discovered … no one would possibly believe that it could exist because it’s so sophisticated,” said mathematician Tony Freeth in a NOVA documentary. Mathias Buttet, director of research and development for watchmaker Hublot, said in a video released by the Hellenic Republic Ministry of Culture and Tourism, “This Antikythera Mechanism includes ingenious features which are not found in modern watch-making.”
9. Drill Bit in Coal
John Buchanan, Esq., presented a mysterious object to a meeting of the Society of Antiquaries of Scotland on Dec. 13, 1852. A drill bit had been found encapsulated in coal about 22 inches thick, buried in a bed of clay mixed with boulders about 7 feet thick.
File image of coal (Kkymek/iStock) File image of a drill Konstik/iStock; edited by Epoch Times
The Earth’s coal is said to have formed hundreds of millions of years ago. The Society decided that the instrument was of a modern level of advancement. But it concluded that “the iron instrument might have been part of a borer broken during some former search for coal.”
Buchanan’s detailed report did not include any signs that the coal surrounding the instrument had been punctured by drilling.
8. 2.8-Billion-Year-Old Spheres?
Spheres with fine grooves around them, found in mines in South Africa, have been said by some to be naturally formed masses of mineral matter. Others have said they were precisely shaped by a prehistoric human hand.
Top left, bottom right: Spheres, known as Klerksdorp spheres, found in the pyrophyllite (wonderstone) deposits near Ottosdal, South Africa. (Robert Huggett) Top right, bottom left: Similar objects known as Moqui marbles from the Navajo Sandstone of southeast Utah. Paul Heinrich
“The globes, which have a fibrous structure on the inside with a shell around it, are very hard and cannot be scratched, even by steel,” said Roelf Marx, curator of the museum of Klerksdorp, South Africa, according to Michael Cremo’s book, “Forbidden Archaeology: The Hidden History of the Human Race.” Marx said the spheres are about 2.8 billion years old.
If they are mineral masses, it is unclear how exactly they formed.
7. Iron Pillar of Delhi
To read the rest, go to: https://www.zerohedge.com/geopolitical/17-out-place-artifacts-suggest-high-tech-civilizations-existed-thousands-or-millions
China has been dumping US treasuries, while Trump has been gunning for Jerome Powell at the Fed, and central bank digital currency (CBDC) was banned by Congress. Trump signed the Guiding and Establishing National Innovation for US Stablecoins (GENIUS) Act into law. Stablecoins are a type of cryptocurrency backed by assets considered to be reliable, such as a national currency. US dollar-backed stablecoins make US dollars easier to use globally, settle transactions faster, and record transactions on a blockchain ledger. Clayton Moore said that this is how Trump plans to keep the US dollar as the world’s reserve currency, not by printing more money, but by making the dollar more useful than anything else. There is much skepticism, and Catherine Austin Fitts criticized stablecoins early on, saying that they are a backdoor to central bank digital currency (CBDC).
Fitts said that stablecoins are a way to turn on helicopter money like you have never seen it turned on before. She added, “the guys who control how that money flows can literally buy up the world.” Mark Goodwin warned against the tokenization of real-world assets, including land and nature.
[Note: Need To Know News does not endorse any investment offers]
China has been dumping US treasuries, while Trump has been gunning for Jerome Powell at the Fed, and central bank digital currency (CBDC) was banned by Congress. Trump signed the Guiding and Establishing National Innovation for US Stablecoins (GENIUS) Act into law.
Catherine Austin Fitts criticized stable coins as a backdoor to CBDC. Here are a few short videos from Catherine Austin Fitts and Mark Goodwin:
Blockchain is simply a ledger/accounting system. Mark Goodwin said that digital money can be traced in that public ledger and can be used as a tool for warrantless surveillance.
He explained that a cash dollar bill can only be spent by an individual once because it is handed over, whereas a digital dollar could be represented on two separate ledgers, which would create more money [and fraud].
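To illustrate Goodwin’s two points (traceability and double-counting), here is a toy sketch, a deliberately simplified illustration rather than any real blockchain or payment system: within a single ledger a balance check blocks a double-spend, but the same dollar recorded on two independent ledgers is simply counted twice.

```python
# Toy illustration only; not any real blockchain or payment system.

class Ledger:
    def __init__(self, opening_balances):
        self.balances = dict(opening_balances)   # account -> balance
        self.entries = []                        # append-only log: every transfer is traceable

    def transfer(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError(f"{sender} cannot double-spend on this ledger")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        self.entries.append((sender, receiver, amount))

ledger_a = Ledger({"alice": 1})
ledger_b = Ledger({"alice": 1})          # the "same" dollar recorded on a second ledger

ledger_a.transfer("alice", "bob", 1)
ledger_b.transfer("alice", "carol", 1)   # both succeed, so one dollar circulates twice

total = sum(ledger_a.balances.values()) + sum(ledger_b.balances.values())
print("Dollars now visible across both ledgers:", total)   # prints 2, not 1
print("Ledger A history:", ledger_a.entries)               # the traceability Goodwin describes
```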
Catherine Austin Fitts said that stablecoins are the new tool for financial warfare and a great land grab. She said that stablecoins are a way to turn on helicopter money like you have never seen it turned on before. She added, “the guys who control how that money flows can literally buy up the world.”
Digital currencies are not tangible and can be destroyed.
Goodwin warned against the tokenization of real world assets, including land and nature.
A critic wrote, “What is the difference between stable coin and CBDC? They’re both digital currencies, which is bad in the long term. According to the GENIUS Act, it will eliminate paper bill at the federal level. So what do you think it is going to happen later down road in state level? Cashless.”
James Li investigates whether Peter Thiel is the next Epstein—exposing his ties to private islands, youth blood infusions, and surveillance tech that’s quietly taking over the US government. Thiel had financial ties with Epstein and he held academic symposiums at Epstein’s Little St. James island. Some of the girls that Epstein brought to the island reported that they were the subjects of scientific experiments.
Peter Thiel was a co-founder of PayPal alongside Elon Musk. He was also the first major investor in Facebook, which was introduced the same day DARPA ended its LifeLog program, which sought to “trace the threads of an individual’s life in terms of events, states, and relationships.” Thiel co-founded Palantir, a data analytics and surveillance company used by many government intelligence and military agencies, including the CIA, NSA, ICE, and the IDF. Thiel, like Epstein, was reported to have hosted “debaucherous” sex parties in his prime that were legendary in gay circles. There are no records implicating Thiel in sexual activity with underage minors or sex trafficking.
Thiel is building his own island in French Polynesia to establish a kind of non-governmental colony.
A podcaster revealed that Thiel held academic symposiums at Epstein’s Little St. James island. Scientific experiments were reported to have been conducted on some of the young girls Epstein brought there.
Thiel is involved in transhumanism research and development and is alleged to spend $40,000 per quarter on blood transfusions from 18-year olds.
Carbyne is an Israeli tech firm that markets itself as a next-gen 911 platform. Its software gives governments and agencies the ability to access individuals’ smartphones, cameras, and mics in real time, and it integrates biometric tracking, geolocation, and live streaming into a single feed that is sent directly to law enforcement or intelligence hubs. The company was funded by Peter Thiel, Jeffrey Epstein and former Israeli Prime Minister Ehud Barak.
In 2015 and 2016, Epstein put $40 million into two funds managed by Valar Ventures, a New York City venture capital firm that was co-founded by Peter Thiel!
Vice President JD Vance is a protege of Peter Thiel, who funded Vance’s US Senate campaign.
Brace For Soaring Electricity Bills: Biggest US Power Grid Sets Power Costs At Record High To Feed AI
Very soon if you want AI (and even if you don’t), you won’t be able to afford AC.
Just this morning we warned readers that America’s largest power grid, PJM Interconnect, which serves 65 million people across 13 states and Washington, DC, and more importantly feeds Deep State Central’s Loudoun County, Virginia, also known as ‘Data Center Alley‘ and which is recognized as one of the world’s largest hubs for data centers…
… had recently issued multiple ‘Maximum Generation‘ and ‘Load Management‘ alerts this summer, as the heat pushes power demand to the brink with air conditioners running at full blast across the eastern half of the U.S.
But as anyone who has not lived under a rock knows, the deeper issue is that there’s simply not enough baseload juice to feed the relentless, ravenous growth of power-hungry AI server racks at new data centers.
“There is simply no new capacity to meet new loads,” Joe Bowring, president of Monitoring Analytics, the independent watchdog for PJM Interconnection, told Bloomberg. “The solution is to make sure that people who want to build data centers are serious enough about it to bring their own generation.”
Well, there is another solution: crank up prices to the stratosphere.
And that’s precisely what happened. As Bloomberg reports, business and households supplied by the largest US grid will pay $16.1 billion to ensure there is enough electricity supply to meet soaring power demand, especially that from a massive buildout in AI data centers.
The payouts to generators for the year starting June 2026 topped last year’s record $14.7 billion, according to PJM Interconnection LLC, which operates the grid stretching from the Midwest to the mid-Atlantic. That puts the capacity price per megawatt each day at a record $329.17 from $269.92.
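As a rough sanity check on those headline figures (my own back-of-envelope arithmetic, not a number from Bloomberg or PJM), the clearing price and the total payout imply roughly how much capacity was procured:

```python
# Back-of-envelope only; zonal prices and auction mechanics make the real figure approximate.
total_payout = 16.1e9        # $ paid to generators for the delivery year starting June 2026
price_mw_day = 329.17        # record capacity clearing price, $ per MW per day
days = 365

implied_capacity_mw = total_payout / (price_mw_day * days)
print(f"Implied capacity procured: ~{implied_capacity_mw:,.0f} MW (~{implied_capacity_mw/1000:.0f} GW)")
```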
In response to the blowout payout, shares of Constellation Energy and Talen Energy surged in late trading in New York on Tuesday.
As millions of Americans will very soon learn the hard way, AI data centers are driving the biggest surge in US electric demand in decades, leading to higher residential utility bills. That’s a key reason why PJM’s auction, once only tracked by power traders and plant owners but now increasingly a topic for general consumption as electricity bills are about to hit an all time high, has also become closely watched by politicians and consumer advocates.
As Bloomberg notes, this is the first auction that included both a price floor and cap, setting the range at $177.24 to $329.17, which of course was the clearing price level reached in this auction. Why even bother pretending there is an auction: just set the price at the max and be done with it. Last year’s 600% jump in capacity prices set off a political firestorm, resulting in PJM reaching a settlement with Pennsylvania Governor Josh Shapiro to essentially cap gains for two years and make auction prices more predictable after wild swings in recent years.
Despite the increase in costs across the grid, the price cap trimmed costs for consumers who saw the biggest hikes in the last auction. Exelon’s Baltimore-area utility reached about $466 per megawatt-day last time, while Dominion Energy’s Virginia territory came in at about $444.
Payouts to generators stayed at high levels due to surging demand from big data centers coming online swiftly, said Jon Gordon, policy director of non-profit clean energy advocacy Advanced Energy United. New facilities are consuming as much power as towns or small cities, coinciding with a wave of older power plants shutting down and lagging investment in new supplies and grid upgrades, he said.
The per-megawatt price exceeding the 2024 auction and closing at an all-time high is bullish for independent power producers including NRG, Talen, Constellation and Vistra, Barclays analyst Nick Campenella had forecast. These generators have spent more than $34 billion so far this year on deals to buy up mainly natural gas-fired power plants to feed the AI boom, especially in PJM.
‘It will destroy this place:’ Tucker County residents fight for future against proposed data center
After a little-known company proposed a data center and natural gas plant in the tourism destination, known for its natural wonder and outdoor recreation, residents are left with questions, mounting concerns and few answers.
A complex of data centers in Ashburn, Va. The city is located in Loudoun County, which has been dubbed “Data Center Alley.” (Gerville | Getty Images)
As a child, Nikki Forrester dreamed of living in a cabin in the woods surrounded by mountains, trees, water and the outdoor opportunities that came with the natural land. In 2022 — four years after earning her graduate degree and moving to Tucker County from Pittsburgh — Forrester and her partner made that dream a reality when they bought two acres of land near Davis, West Virginia to build a home.
Forrester has thrived in the small mountain town known for its mountain biking, hiking, stargazing, waterfalls and natural scenery. She and her partner moved into their new home in February. Hiking and biking trails are right outside her front door. In the winter, she said, snow piles up making the nearby mountains look like “heaven on Earth.”
It’s been quite literally a dream come true.
“I feel like I’ve never felt at home so much before. I love being in the woods. I love this community. It’s super cheesy, but this was my childhood dream and now it’s actually come true,” Forrester said. “It felt so good to set down roots here. We knew Davis was where we wanted to start our future.”
But in March, one small public notice posted in the Parsons Advocate — noticed by resident Pamela Moe, who scrambled to find answers after seeing it — changed Forrester’s assumptions about that future.
A Virginia-based company, Fundamental Data, was applying for an air permit from the West Virginia Department of Environmental Protection for what it called the “Ridgeline Facility.” The company’s heavily redacted application showed plans to build an off-the-grid natural gas power plant between Thomas and Davis. The power plant would likely be designed to power an enormous data center just a mile from Tucker County’s most populous areas and biggest tourist draws.
Earlier this month, representatives for Fundamental Data — who did not respond to requests for comment on this article — told the Wall Street Journal that the facility could be “among the largest data center campuses in the world,” spanning 10,000 acres across Tucker and Grant counties if fully realized.
Now, Forrester said, she and her neighbors are in the middle of what feels like a “fight for [their] lives” as they attempt to learn more about the vague development plans and fight against “big data.”
Her images of the future (skiing on white snow, hiking to waterfalls, looking up at clear and starry nights, all with one-of-a-kind mountain scenery below) now exist in the shadow of a looming natural gas plant, an industrial complex and the contaminants that could come with them. The fresh mountain air that surrounds her home and community could be infiltrated by tons of nitrogen oxides (gases that contribute to smog), carbon monoxide, particulate matter and volatile organic compounds, per the company’s air permit application.
“Honestly, I feel like if this happens, it will destroy this place. People come here because it’s remote, it’s small, it’s surrounded by nature. If you have a giant power plant coughing up smoke and noise pollution and light pollution, it puts all of those things in jeopardy,” Forrester said. “It would honestly make me question whether I would want to live here anymore, because I do love the landscapes here so much, but they would be fundamentally altered and, I think, irreparably harmed if this actually comes to be.”
Tucker United and a fight against the many ‘unknowns’
Since learning of the project in March, Forrester and dozens of other Tucker County residents have banded together and formed Tucker United. The residents, all volunteers, want answers from Fundamental Data or anyone else regarding details of the proposed Ridgeline facility.
But that fight hasn’t been easy. The state DEP has allowed Fundamental Data — a company with little to no information publicly available — to submit a redacted air permit application, omitting details regarding potential air pollutants that could come from the site.
A heavily redacted page from Fundamental Data’s air permit application to the state Department of Environmental Protection.
According to reporting in Country Roads News, local officials were unaware of the project before reporters and members of the public brought it to their attention.
Reading the Wall Street Journal article was the first time most residents learned of the potential size of the planned development.
Josh Nease, who lives outside of Thomas and Davis in an unincorporated part of Tucker County, said the unknowns about the project have been the most frustrating part to grapple with.
“There’s no lack of uncertainty right now, that’s for sure,” said Nease, a sixth generation West Virginian who moved to Tucker County after spending vacations there as a child growing up in Bridgeport. “I think the unknowns here are really worrying.”
If given the chance, he would want to ask representatives of Fundamental Data the following questions: Why the lack of transparency? Why does the company want to locate in Tucker County and why not further out from the towns? And why does it feel like there’s resistance against working with the local governments and community members?
Luanne McGovern, an engineer by trade who owns property in Tucker County and who sits on the board of West Virginia Highlands Conservancy, an environmental nonprofit in the region, holds similar frustrations to Nease.
Per the permit application, the Ridgeline facility, in its currently proposed form, would use gas-fueled turbines with heat recovery steam generators. Diesel would be kept on site in three 10-million-gallon storage tanks as a backup power source in case of gas line interruptions. Those tanks would be 66 feet tall and 180 feet in diameter. Leaks from pumps and valves, among other pieces of equipment, are to be expected, per the application. Operations at the facility are expected to begin by 2028.
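As a rough sanity check on those figures, a simple cylinder calculation (assuming right-cylindrical tanks with the dimensions quoted from the application; the gallons-per-cubic-foot conversion is standard) suggests the stated dimensions and the nominal 10-million-gallon capacity are consistent:

```python
import math

# Rough geometry check on the diesel tank figures in the permit application.
# Assumes simple right-cylindrical tanks; dimensions are those quoted above.
diameter_ft = 180
height_ft = 66
gal_per_cubic_ft = 7.48052  # US gallons per cubic foot

volume_ft3 = math.pi * (diameter_ft / 2) ** 2 * height_ft
volume_gal = volume_ft3 * gal_per_cubic_ft

print(f"Gross volume per tank: about {volume_gal / 1e6:.1f} million gallons")
# About 12.6 million gallons gross, so a nominal 10-million-gallon working
# capacity per tank is plausible once headroom is allowed for.
```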
When residents started working together to make sense of Fundamental Data’s air permit application, they asked McGovern to look it over and share her thoughts. Having worked on similar permit requests before, she knew what she was looking at: A large, natural gas power plant.
What was more notable, however, was what she was unable to view.
Pollutants were listed on the request, but only in annual caps. There was no information on water usage despite some data centers using up to 5 million gallons of drinking water a day, straining resources in communities. While the heights of the diesel storage tanks were included, she said information on the turbines wasn’t.
While the DEP asked for clarification on Fundamental Data’s redactions following an influx of public comments from concerned residents, the company said it believed the omitted information met the state’s standard for confidentiality. The DEP ended up agreeing.
Fundamental Data, through its representative Casey Chapman, provided some details to the DEP in an attempt to put the public at ease: the site “does not plan” to use water from local water systems, rivers or streams and won’t discharge wastewater into them; mountains surrounding the development should “substantially limit” its visibility from populated areas and the facility “expects” to operate at noise levels that adhere to federal regulations.
But McGovern still had questions.
“Where is the water coming from? How high are these turbines? Where will they be? If we had some answers to these questions, we could do some modeling and figure out what the potential environmental impact would be, but we don’t,” McGovern said. “We’re just completely in the dark. There’s so many unanswered questions. As an engineer, there’s huge parts of this permit that are just bad. There’s no information provided, not even a level of standard of information that you would expect.”
Nease is realistic; he understands that these are complex issues and that the state, as well as his region, is attempting to find new ways to bolster the economy and, hopefully, improve West Virginia’s economic standing long term.
He sees the challenges hitting Tucker County residents every day. There’s a housing shortage and short-term rentals are driving up costs for the places that do exist, pricing out residents who can’t afford to live where they work. While tourism can bring in crowds, it’s often only seasonal. The county’s population — like most of West Virginia — is declining.
“I fully understand the need to diversify the economy. I support doing that, we talk about it all the time. I guess I’m just not sure that a project like this is the solution,” Nease said. “We just don’t know enough about it. We don’t know if this is going to benefit the Tucker County economy. I sure hope it does, but all I have to rely on for that are vague statements.”
‘It feels extractive:’ West Virginia data centers to operate with no local oversight, questionable economic gains
On March 18 — the same day that Fundamental Data submitted its air permit application to the DEP — House Bill 2014 was introduced at the state Legislature to incentivize data centers to locate in West Virginia and generate their own power sources through microgrids. Senate President Randy Smith, a Republican who represents Tucker County and voted for HB 2014, did not respond to requests for comment on this article.
Despite being a key priority for Gov. Patrick Morrisey, who requested its introduction, the bill was presented more than halfway through the state’s 60-day session. In back-and-forths over several weeks, lawmakers amended the bill again and again. One change removed a requirement for microgrids to use renewable energy sources, opening the door for coal and natural gas. Several other amendments changed the tax structure for any property taxes collected on the developments.
The version of the bill that now stands as law allows “high impact data centers” to bypass local zoning ordinances and other regulatory processes and establishes a certified microgrid program, which means data centers can produce and use their own power without connecting to existing utilities.
The law creates a specialized tax structure for data centers and microgrids, which must be placed in designated districts. Local governments have little say or control over those districts, which are established at the state level.
Taxes collected on any data centers and microgrids operating in West Virginia would be split as follows: 50% will go to the personal income tax reduction fund; 30% will go to the county where the data center is located; 10% will go to the remaining 54 counties, split on a per capita basis using the most recent U.S. Census; 5% will be placed in the Economic Enhancement Grant Fund administered by the Water Development Authority; and the final 5% will be put in the newly created Electric Grid Stabilization and Security Fund.
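To make that distribution concrete, here is a minimal sketch dividing a hypothetical $10 million in collections under the percentages described above; the $10 million figure is invented purely for illustration.

```python
# Illustrative split of a hypothetical $10 million in data center / microgrid
# property tax revenue under HB 2014's distribution formula as described above.
revenue = 10_000_000  # hypothetical amount, for illustration only

split = {
    "Personal income tax reduction fund": 0.50,
    "Host county": 0.30,
    "Other 54 counties (per capita)": 0.10,
    "Economic Enhancement Grant Fund": 0.05,
    "Electric Grid Stabilization and Security Fund": 0.05,
}

for recipient, share in split.items():
    print(f"{recipient}: ${revenue * share:,.0f}")
# The host county keeps $3 million of the hypothetical $10 million generated locally.
```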
Initially, those taxes were going to be completely diverted away from localities where the data centers would be located, angering county commissioners and other local leaders from throughout the state.
Kelly Allen, executive director of the West Virginia Center on Budget and Policy, said the fact that 50% of any tax revenue collected will go to offset the state’s personal income tax cuts is a concern, especially when only 30% will return to the localities that host the data centers.
“Local governments are really limited in the ways that they can raise revenue, which is largely controlled by either the state constitution or the state legislature. So taking away a significant slice of one of the only ways that they can raise revenue — through property taxes — leaves [localities] with fewer options to fund basic services,” Allen said. “At the same time, these data centers and micro grids are probably going to increase the need for the public services that local governments pay for.”
Allen pointed to the potential risks that come with operating power plants: county fire and police services will be needed for safety at the plants and water districts may be impacted, she said.
Essentially, she said, counties will be on the hook for funding more services while only receiving a fraction of the revenue generated by the sources of those costs.
And, generally, there’s no guarantee — despite Fundamental Data’s claims for the Tucker County facility — that data centers will serve as massive employers.
Nationwide, according to the U.S. Census, jobs in data centers are increasing. But more than 40% of all data center jobs in 2023 were in just three states. Per an analysis by Business Insider, most of the available data center jobs are in construction and are contracted to workers from outside the places where the centers are located.
Data centers are largely automated. Microsoft, for example, employs just 50 people per facility. In West Virginia, because of the inclusion of microgrids (which aren’t required to be built for data centers), the picture could look different. But again, the lack of detail from companies coming here makes the real impact difficult, if not impossible, to determine.
Allen said she’s wary of the state’s potential reliance on data centers for a financial boom given the state’s history of extraction-based economics.
Like with the coal economy, residents across the state will bear the aesthetic, environmental and health costs associated with living near data centers and their power plants. Most of the profits, however, may not return to them, Allen said.
“It’s not exactly identical to coal or natural gas or timber, but it feels extractive in the same way in that the benefits of the data center are borne by people outside of West Virginia, while the costs are borne by our residents,” Allen said.
Nease said that while he wants to be “pragmatic” about the potential for development in Tucker County, he can’t help but think of the state’s history in that regard either.
“I’m worried we’re going to fall into that same trap again. It’s an age old story — not just for West Virginia. Some people are going to benefit from this project, they just might not be here,” Nease said. “The company will benefit, its [shareholders] will. But will we?”
‘A race to the bottom:’ While West Virginia lawmakers want to compete with Virginia, locals say it’s not possible
While state lawmakers spent hours this legislative session debating how to craft the state’s new law to attract data centers, several couldn’t stop thinking about, or mentioning, neighboring Virginia, where the development of large, high-impact data centers has boomed.
Echoing sentiments shared by Morrisey through his “Backyard Brawl” plan to compete with neighboring states economically, delegates — including Del. Clay Riley, R-Harrison, who sits on the House Committee on Energy and Public Works, where the bill passed — said they wanted to see data center development here thrive like it has in Northern Virginia.
Loudoun County, Virginia has been dubbed “Data Center Alley.” It’s home to the largest data center market in the world.
But that development didn’t happen overnight, said Julie Bolthouse, director of land use at Piedmont Environmental Council in Virginia.
The industry started building in Northern Virginia in the 1990s and 2000s. Some of the largest data and internet providers at the time were located there. Over time, though, the market has changed.
Bolthouse said what used to be small complexes organized like business parks — featuring restaurants, shopping, day cares and more for people who lived in the region — are now large campuses with few people, no outside amenities and mostly computers and software.
And those “hyper-scaled” complexes — in Virginia and beyond — haven’t come without costs. The pollutants emitted by large centers are known to exacerbate respiratory problems and other health conditions. Residents nearby can hear the incessant buzzing and hums of the computers and generators at work. Light pollution, depending on the size and type of facility, can be impossible to ignore.
But these issues, outside of the environmental ones, vary from place to place because of local ordinances.
“That is like the only thing that’s really protecting Virginia communities, because the only way that the people who live in these localities are able to get any kind of protection is because of noise ordinances, because of the lighting ordinances,” Bolthouse said.
In West Virginia under HB 2014, residents won’t have the same protections or powers because the state law supersedes local ordinances.
And now, decades into Virginia’s ever changing data center sector, Bolthouse and other environmentalists are seeking more regulations on the state level since the nature of these data centers has changed so much over such a short period of time.
“That’s the push we’re seeing now — for the state to come in and add additional regulations, to look at the environmental impact,” Bolthouse said. “No one is talking about taking away the ability of localities to regulate these facilities. I can’t imagine that.”
And while the landscape for data centers is evolving in Loudoun County and beyond, the reason so many large companies have decided to locate their centers in Northern Virginia goes back to the 1990s. The infrastructure for them to be developed, Bolthouse said, already existed; it wasn’t created from scratch, as West Virginia is now attempting to do.
“There’s such a robust fiber network here. These data centers are kind of like a gigantic global computer. They talk to each other, and so the closer they are to all the other cloud providers, the better,” Bolthouse said. “When you put a data center here, your data is stored in Northern Virginia and you are in spitting distance to [Amazon], Google, Microsoft, all the big co-locators … probably every big business has an operation here in Northern Virginia. So it’s like the Wall Street of the data center industry. That’s why they want to locate here.”
Bolthouse warned that without regulations, without protections and without the advantages that Virginia has through its location and infrastructure, West Virginia could be attempting to enter a new sector by inviting in the “worst players.”
“What you’re going to get if you do it this way is the worst players, the ones that didn’t need to be in Northern Virginia … the players that are wanting that lack of regulations because they didn’t want to abide by rules and didn’t want to or need to protect communities, which is worse for West Virginia and the communities,” Bolthouse said. “What West Virginia is doing is not what Virginia is doing.”
She said West Virginia needs to look at the assets it already has, not the assets others in the sector have worked with for decades.
Those assets, in Bolthouse’s words, are the same things that made Forrester feel like her childhood dreams were coming true when she built a home in Tucker County: the state’s “beautiful mountains, its rivers, its natural beauty and outdoor opportunities.”
“That’s what West Virginia should be leveraging. The state shouldn’t be trying to get something that another state has already secured the market on,” Bolthouse said. “I don’t know that West Virginia can become the next Data Center Alley. I don’t think that’s actually feasible … You’re trying to basically have a race to the bottom, and you’re only going to get the worst players.”