• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

AI, the Drone Wars, and UBI.

Any society in which technological advances that let us do more stuff with less effort are a bad thing for most members of that society deserves to collapse.

Sooner or later, we will need to shift from a model of society in which productive work determines individual wealth, to one in which automated work provides wealth for all.

In the meantime, expect every single improvement in technology to be met by a chorus of Chicken Littles bemoaning the fate of the poor gas-lamp lighters and buggy-whip makers, and warning that we are all doomed.

We could end up with a society wherein a tiny number of owners of capital have everything they can dream of, and the remaining billions starve. But given the number of warnings we have had, and given the decline of feudalism in the last five centuries, if we did end up in such a dystopia, we would only have ourselves to blame.
 
We could end up with a society wherein a tiny number of owners of capital have everything they can dream of, and the remaining billions starve. But given the number of warnings we have had, and given the decline of feudalism in the last five centuries, if we did end up in such a dystopia, we would only have ourselves to blame.

I don’t think that is possible. A tiny number of owners of capital depend on people to buy the products that their capital, in conjunction with paid labor, makes possible. If everyone is starving, they will starve, too.
 
If everyone is starving, they will starve, too.
Nah. They will have functionally infinite hoards of food, energy sources and land, as well as all the willing starving laborers they could desire. The trick to long-term stability would be for the fabulously rich to limit their own numbers, much as the Egyptian god-kings did: through murder and incest.
 
We could end up with a society wherein a tiny number of owners of capital have everything they can dream of, and the remaining billions starve. But given the number of warnings we have had, and given the decline of feudalism in the last five centuries, if we did end up in such a dystopia, we would only have ourselves to blame.

I don’t think that is possible. A tiny number of owners of capital depend on people to buy the products that their capital, in conjunction with paid labor, makes possible.
Do they? Why?
If everyone is starving, they will starve, too.
To take the ultimate extreme case, imagine that there are a few hundred Star Trek replicators that can make anything - food, houses, cars, clothes, yachts, you name it, they can do it.

If you own one, you can make, for your own use, literally anything you want, at any time.

How do you starve in this scenario?

Why do you want or need to sell anything to the masses in this scenario?

When you don't need the labour of the masses, you don't need their money, either.

The two hundred "rich" people in that scenario can choose to make enough of everything for everyone; Or just enough for themselves, and let the rest of humanity starve. They have no financial motive to do either; Money is meaningless unless you need to trade in order to obtain things - and if you have one of these machines, you don't need to trade, you just need to say "Tea, Earl Grey, hot".

Of course, in a less extreme case, your machine might not do everything; Perhaps yours only makes food and drinks. Between them, the two hundred machines worldwide can do everything, though. So you need to trade with the other owners of the means of production - the other 199 capitalists. Why do you and your 199 friends need to trade with the eight billion starving peasants, though?
 
I find this scenario extremely far-fetched, one that gives entirely too much credit to techno-hubris. I suppose it is technically possible, but if we ever got to that technological point, eight billion starving peasants would vastly outnumber 200 capitalists, and blood would flow.
 
I find this scenario extremely far-fetched, one that gives entirely too much credit to techno-hubris. I suppose it is technically possible, but if we ever got to that technological point, eight billion starving peasants would vastly outnumber 200 capitalists, and blood would flow.
Eight billion starving and unarmed humans are no match for an unlimited army of Terminators. Skynet wins hands down, even if it doesn't have autonomy and takes its commands from the owners of the machines that build the semi-autonomous weapons systems.

For the record, I find it far-fetched too; As I said at the outset, this is the "ultimate extreme case", intended to clearly illustrate the pitfalls, not describe a plausible scenario.

There is a continuum of wealth concentration possible in a highly technological society, with this at one extreme, and Star Trek style techno-communism, where everyone has whatever they want, at the other.

The only remaining question, once machines replace human workers, is "Who is allowed to say 'Tea, Earl Grey, hot!', and why?"
 
The only remaining question, once machines replace human workers, is "Who is allowed to say 'Tea, Earl Grey, hot!', and why?"
On reflection, we will probably also need some limitations on who can say "Build me an army of a million T-1000 killer cyborgs and send them to kill the woman who cut me off in traffic yesterday", too.

Are you Sarah Connor?
 
I'd be more interested in the text of a direct interview with this guy than reading too deeply into this article. Some of the quotes suggest that the journalist doesn't really know the subject. Take this one for example:

He said there was already evidence of large language models - a type of AI algorithm used to generate text - choosing to be deceptive

I strongly suspect this quote has been taken out of context. LLMs don't think, they're not 'intelligent', and shouldn't be called AI. They're probabilistic models that predict the next token based on their training data, while using massive amounts of power and contributing to global warming. They aren't deceptive; they're often just wrong, because they're not intelligent. That being said, take the rest of the article with a fair-sized grain of salt.
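The "predict the next token" description can be made concrete with a toy sketch. Everything below is invented for illustration — the contexts, tokens and probabilities are made up, and a real LLM computes these distributions from billions of learned parameters rather than a lookup table:

```python
import random

# Toy "language model": for each two-token context, a hypothetical
# probability distribution over possible next tokens.  The numbers
# here are invented; a real model learns them from training data.
NEXT_TOKEN_PROBS = {
    ("Tea", "Earl"): {"Grey": 0.9, "Gray": 0.1},
    ("Earl", "Grey"): {"hot": 0.7, ",": 0.3},
}

def sample_next(context, temperature=1.0):
    """Sample the next token given the last two tokens of context."""
    probs = NEXT_TOKEN_PROBS[tuple(context[-2:])]
    tokens = list(probs)
    # Temperature rescales the distribution: values below 1 make the
    # most likely token dominate; values above 1 flatten the choice.
    weights = [p ** (1.0 / temperature) for p in probs.values()]
    return random.choices(tokens, weights=weights)[0]

tokens = ["Tea", "Earl"]
for _ in range(2):
    tokens.append(sample_next(tokens, temperature=0.1))
# With a low temperature this almost always prints: Tea Earl Grey hot
print(" ".join(tokens))
```

Nothing in the loop "understands" tea orders; it just keeps sampling from whichever distribution the current context selects — which is the sense in which "they're all auto-complete."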

A lot of the doomsday talk that's promoted in the media is designed to distract the public from the real harms that sophisticated, but not that intelligent, technology is causing right now. It's usually a smoke screen. I'm not saying that's the case here, but more often than not this is the reason.

IMO, one of the actual issues with automation and regulation is that democracy wasn't designed to deal with problems that are this complex. Democratic government exists to keep despotism at bay, but it's not particularly effective at dealing with technology, because most active politicians don't even know how Facebook makes money. The technology is getting to a point that's beyond many of these people's comprehension.
 
Sooner or later, we will need to shift from a model of society in which productive work determines individual wealth, to one in which automated work provides wealth for all.
Yes, we have to move to communism.
 
Sooner or later, we will need to shift from a model of society in which productive work determines individual wealth, to one in which automated work provides wealth for all.
Yes, we have to move to communism.
Dude, that's not communism.
quacks like a duck, walks like a duck.
Shits on definitions like a duck.
No, that's more the geese.

Geese are not to be confused with the gray ducks, and even I can recognize that social interest plus well-taxed and well-regulated industry is not communism.
 
I'd be more interested in the text of a direct interview with this guy than reading too deeply into this article.

Here's a link to a YouTubed interview, along with transcript:

I copy/pasted the Hinton quotes below from that page.

Some of the quotes suggest that the journalist doesn't really know the subject. Take this one for example:

He said there was already evidence of large language models - a type of AI algorithm used to generate text - choosing to be deceptive

I strongly suspect this quote has been taken out of context.

The 2nd article cited in OP was written by Will Douglas Heaven, who does have a PhD in Computer Science.
LLMs don't think, they're not 'intelligent', and shouldn't be called AI. They're probabilistic models that predict the next token based on data sets,

Am I wrong that they use the same sort of back-prop net that other AIs use? These are thought to (crudely) mimic biological brains. It is moot whether they are "conscious" enough to justify words like "choose" or "deceive."
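As a rough illustration of what "back-prop net" means here — gradient-based weight updates — a deliberately tiny sketch: a single weight learning y = 2x by gradient descent. This shows only the bare principle; real networks stack millions of such weights with nonlinearities between them, and the gradients are propagated backward through many layers:

```python
def train(samples, lr=0.1, epochs=50):
    """Fit a single weight w so that w * x approximates y."""
    w = 0.0  # start from zero; real nets start from small random values
    for _ in range(epochs):
        for x, y in samples:
            pred = w * x                # forward pass: make a prediction
            grad = 2 * (pred - y) * x   # gradient of squared error w.r.t. w
            w -= lr * grad              # backward step: nudge w downhill
    return w

w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(round(w, 3))  # converges to 2.0
```

Whether this loop, scaled up enormously, amounts to "understanding" is exactly the question Hinton is addressing below.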

Geoffrey Hinton said:
It's difficult to explain, but I'll do my best. It's true in a sense; they [LLMs] are all auto-complete. But if you think about it, if you want to do really good autocomplete, you need to understand what somebody's saying. And they've learned to understand what you're saying just by trying to do autocomplete. But they now do seem to really understand.

. . .

I think if you bring sentience into it, it just clouds the issue. Lots of people are very confident these things aren't sentient. But if you ask them what do they mean by 'sentient', they don't know. And I don't really understand how they're so confident they're not sentient if they don't know what they mean by 'sentient'. But I don't think it helps to discuss that when you're thinking about whether they'll get smarter than us.

I am very confident that they think. So suppose I'm talking to a chatbot, and I suddenly realize it's telling me all sorts of things I don't want to know. Like it's telling me it's writing out responses about someone called Beyonce, who I'm not interested in because I'm an old white male, and I suddenly realized it thinks I'm a teenage girl. Now when I use the word 'thinks' there, I think that's exactly the same sense of 'thinks' as when I say 'you think something.' If I were to ask it, 'Am I a teenage girl?' it would say 'yes.' If I had to look at the history of our conversation, I'd probably be able to see why it thinks I'm a teenage girl. And I think when I say 'it thinks I'm a teenage girl,' I'm using the word 'think' in just the same sense as we normally use it. It really does think that.

while using massive amounts of power and contributing to global warming.
Cite? The TOTAL amount of energy consumed by AIs may be largish today just because they've become so plentiful, but how does it compare with, say, Bitcoin mining?

They aren't deceptive; they're often just wrong, because they're not intelligent. That being said, take the rest of the article with a fair-sized grain of salt.

A lot of the doomsday talk that's promoted in the media is designed to distract the public from the real harms that sophisticated, but not that intelligent, technology is causing right now. It's usually a smoke screen. I'm not saying that's the case here, but more often than not this is the reason.

IMO, one of the actual issues with automation and regulation is that democracy wasn't designed to deal with problems that are this complex. Democratic government exists to keep despotism at bay, but it's not particularly effective at dealing with technology, because most active politicians don't even know how Facebook makes money. The technology is getting to a point that's beyond many of these people's comprehension.

Geoffrey Hinton said:
It's possible that there's no way we will control these super intelligences and that humanity is just a passing phase in the evolution of intelligence - that in a few hundred years time there won't be any people, it'll all be digital intelligences. That's possible. We just don't know.
 
I don’t “think” that LLMs cogitate.
When nobody is prompting me to say something, I might still be “thinking”. But the LLM doesn’t do that afaik. Unsolicited thinking might be a defining characteristic of sentience?
 
Cite? The TOTAL amount of energy consumed by AIs may be largish today just because they've become so plentiful, but how does it compare with, say, Bitcoin mining?

It's a pretty common talking point on LinkedIn. LLMs require data centers and quite a bit of power. No idea how they compare to Bitcoin, but they're definitely not green.

And that's the thing: companies care about one thing - $$$ - while externalizing the impacts of what they're building. There was a time that I was sympathetic toward CEOs and business leaders. I'm not anymore; it's awful, awful people promoting this garbage.
 
Sooner or later, we will need to shift from a model of society in which productive work determines individual wealth, to one in which automated work provides wealth for all.
Yes, we have to move to communism.
Dude, that's not communism.
quacks like a duck, walks like a duck.
Shits on definitions like a duck.
No, that's more the geese.

Geese are not to be confused with the gray ducks, and even I can recognize that social interest plus well-taxed and well-regulated industry is not communism.
Who determines social interest of the taxed industry? The committee of the golden geese?

Whatever you call it, it sounds more like communism than free market capitalism to me. I agree with Barbos: free market capitalism will not work unless money and private property still have trading value to someone. Full stop.
 
Sooner or later, we will need to shift from a model of society in which productive work determines individual wealth, to one in which automated work provides wealth for all.
Yes, we have to move to communism.
Dude, that's not communism.
quacks like a duck, walks like a duck.
Shits on definitions like a duck.
No, that's more the geese.

Geese are not to be confused with the gray ducks, and even I can recognize that social interest plus well-taxed and well-regulated industry is not communism.
Who determines social interest of the taxed industry? The committee of the golden geese?

Whatever you call it, it sounds more like communism than free market capitalism to me. I agree with Barbos: free market capitalism will not work unless money and private property still have trading value to someone. Full stop.
The whole of society, together. In other words, the government.

This shouldn't be "donors" or "the more heavily taxed", but everyone.

This is distinctly different from communism insofar as business is business and does what business does, so long as it isn't allowed to destroy or despoil the world... But in the end, the expectation is that they do it for society, in large part.
 
As long as conservatives have power in the U.S. there will be no UBI (for us U.S.ians)
 