You can easily be taken in by the algorithmic “echo chamber” that your learned preferences present. Everyone thinks just like you!

Artificial Intelligence

Here is an interesting article on the irrational terror of AI destroying something important:

https://futurism.com/the-byte/amazon-flooded-books-written-by-ai

A while back, I used a method to decide whether something I read was true: if it was corroborated by other writers, it was more likely to be true. Then it became a thing for different blogs (just reacting to a demand for content) to simply copy postings, perhaps to add to an idea's supposed veracity. A person had to discount convenient copies of basically the same blog post broadcast ever more widely. I think this was something of a fad, and you don't see much of it anymore because people caught on to the trick. It isn't done much now that blogs are just one of dozens of publishing vehicles and no longer dominant, so they are less interesting to thought promoters. I really have to work more with inspiration from God to find truth - the structures of man just don't cut it anymore.

As you can read in the article linked above, artificial intelligence or AI (a misnomer if ever there was one) is being used by some publishers to sample numerous sources on the internet and generate content based on what was sampled. This process is much faster than human writers and editors, who often don't have the time or resources to compete in producing schlock. As a result, text venues are flooded with derivative content that just recombines already-existing information in a programmed writing style. Sadly, that was the occupation of a number of people, who may now need to find new ways to make money.

This situation gets derided by writers, but the sad fact is that this low-grade fact-reassembly was often all that human writers were doing, just less efficiently. For example, I can see a coming flood of formulaic mystery and romance novels, and then the death of those fiction genres, once people see how easily and quickly these forms can be generated algorithmically, pushing out human authors. Increasingly, anything that amounted to a “get rich quick” sort of scheme will first be subverted by AI, then die once it can be had so cheaply. AI will push a lot of low-value, over-compensated muck (like this very blog post) out of the stream.

The days of stock traders and mid-level managers may also be numbered thanks to AI. Business administration itself will become an app, and jobs will coalesce at the top and the bottom: those who tell the app what to do and those who do what the app instructs. Everything in between that was once done by a person will effectively be done by a relay.

As a counterpoint to all of this, I actually have great faith that entropy will invade, as it always does. I often wonder why scientific and engineering types are so enamored of the capacity of their perfect creations. That is what allows the “echo chamber” effect to enter into the worldview. Remember the “Skynet” of Terminator movie fame? Only sci-fi fantasies piled upon fantasies could produce that synthetic dystopia, where machines eliminated those inefficient biologicals, and make the scenario seem plausible. I hear podcasts gin up fear because researchers swear that AI is “very close” to sentience. It is an example of too many engineers talking only to other engineers about the amazing things they are producing in predictive chess software - a classic “echo chamber”. People aren't just chess pieces that must only use their regulated moves. People will become even more irrational than normal in order to be un-systemic and un-manageable to some AI overlord - it wouldn't even be hard to do. Many people would view it as a service to humanity and treat it like a game.

Like computers, AI will never work reliably enough on its own to do anything on a catastrophic scale - an army of technicians is still required to keep everything working at all, and it doesn't even take a very competent “bad actor” to make things break, and break badly. The “best minds” can't keep rural schools safe from solo disgruntled teenagers; how can they manage to build a product that sane people would trust to rule their lives? AI can't hope to adequately simulate the reality of the chaos we humans cause and cope with day-to-day!

Now imagine the case of an AI feeding on the output of another AI. We humans are weird about identifying whether something comes from a “real” person or a “synthetic” simulation, but what if an AI can't figure that out? Does it fall into an endless loop of generative feedback? It sounds like things would get bizarre quickly and obviously stop working in the way intended. It's like the old “telephone” game: each retelling must not be a too-faithful copy, and the message ends up completely unrelated to the original. How do you teach an algorithm to manage its own information feedback echo and still seem human? How well do humans manage this? Poorly.

All computing consists of a huge number of programmed yet thoughtless relays. The presumption that this is equivalent to the human contribution is silly - we aren't just computers, and our worst sin is to treat each other as if we were just fleshy relays. AI is just profitable dehumanization that can be (easily?) overcome by seeing it for what it really is and not allowing it to control what we think and do individually.

In the end, AI just employs algorithmic tricks to simulate the baser activities of human writers (as one example) and does so faster and more cheaply. All of the fear-mongering comes because this is exactly the trickery of our largely undeserving “betters” in so many realms. Our 'thought-leaders' play up the AI threat because it exposes the formulaic hucksterism that often passes for “leadership” in their vaunted towers.