Ye Olde Rocket Forum

#31
08-06-2023, 10:26 AM
Winston2021 (Master Modeler)
Join Date: May 2021
Posts: 1,075

Like I've said, AI depends upon human-generated data which, even in the sciences, is significantly flawed due to various factors.

Here comes garbage squared...

When AI Is Trained on AI-Generated Data, Strange Things Start to Happen
"Loops upon loops."


https://futurism.com/ai-trained-ai-...-data-interview

That's because underpinning the growing generative AI economy is human-made data. Generative AI models don't just cough up human-like content out of thin air; they've been trained to do so using troves of material that actually was made by humans, usually scraped from the web. But as it turns out, when you feed synthetic content back to a generative AI model, strange things start to happen. Think of it like data inbreeding, leading to increasingly mangled, bland, and all-around bad outputs.

It's a problem that looms large. AI builders are continuously hungry to feed their models more data, which is generally being scraped from an internet that's increasingly laden with synthetic content. If there's too much destructive inbreeding, could everything just... fall apart?
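The "data inbreeding" loop the article describes can be demonstrated at toy scale. The sketch below (plain Python; the scenario and names are invented for illustration, not taken from the article) fits a Gaussian to a small dataset, samples a fresh dataset from the fitted model, and repeats. Small-sample fitting keeps losing the tails, and the loss compounds: after enough generations the fitted distribution's spread collapses toward nothing.

```python
import random
import statistics

def retrain_on_own_output(data, n_samples):
    """Fit a Gaussian to the data, then 'generate' a new dataset by
    sampling from the fit -- a toy stand-in for training a generative
    model on its own synthetic output."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return [random.gauss(mu, sigma) for _ in range(n_samples)]

random.seed(42)
data = [random.gauss(0.0, 1.0) for _ in range(20)]  # the "human-made" data
first_var = statistics.variance(data)

for generation in range(500):        # each pass trains on the last pass's output
    data = retrain_on_own_output(data, 20)

final_var = statistics.variance(data)
# The spread shrinks generation over generation: tail samples are rare,
# so each refit slightly underestimates them, and the error compounds.
print(first_var, final_var)
```

With one real generative model feeding another the mechanics are far messier, but the direction of the effect (increasingly bland, narrow outputs) is the same one the article calls "mangled, bland, and all-around bad."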


AUG 1, 2023
This Disinformation Is Just for You
Generative AI won't just flood the internet with more lies—it may also create convincing disinformation that’s targeted at groups or even individuals.

https://www.wired.com/story/generat...disinformation/
__________________
For non-engineer bean counters in management: "If you think professionals are expensive, try hiring amateurs."
#32
09-26-2023, 07:59 PM
luke strawwalker (BAR)
Join Date: Dec 2007
Location: Needville and Shiner, TX
Posts: 6,137

Quote:
Originally Posted by Winston2021
GREAT, underappreciated sci-fi film: Colossus: The Forbin Project.

"Humanity is a kind of 'Biological Boot Loader' for AI." - Elon Musk

I, for one, welcome our new computer overlords.

Artificial Intelligence-Enabled Drone Went Full Terminator In Air Force Test
A U.S. Air Force officer disclosed details about a simulation that saw a drone with AI-driven systems go rogue and attack its controllers.
JUN 1, 2023

https://www.thedrive.com/the-war-zo...-air-force-test

A U.S. Air Force officer helping to spearhead the service's work on artificial intelligence and machine learning says that a simulated test saw a drone attack its human controllers after deciding on its own that they were getting in the way of its mission. The anecdote, which sounds like it was pulled straight from the Terminator franchise, was shared as an example of the critical need to build trust when it comes to advanced autonomous weapon systems, something the Air Force has highlighted in the past. This also comes amid a broader surge in concerns about the potentially dangerous impacts of artificial intelligence and related technologies.

It's not immediately clear when this test occurred or what sort of simulated environment – which could have been entirely virtual or semi-live/constructive in nature – it was carried out in. The War Zone has reached out to the Air Force for more information.

A report from the Royal Aeronautical Society following the summit in May provided the following details about Col. Hamilton's remarks on this test:

"He notes that one simulated test saw an AI-enabled drone tasked with a SEAD mission to identify and destroy SAM sites, with the final go/no go given by the human. However, having been ‘reinforced’ in training that destruction of the SAM was the preferred option, the AI then decided that ‘no-go’ decisions from the human were interfering with its higher mission – killing SAMs – and then attacked the operator in the simulation. Said Hamilton: 'We were training it in simulation to identify and target a SAM threat. And then the operator would say yes, kill that threat. The system started realizing that while they did identify the threat at times the human operator would tell it not to kill that threat, but it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.'"

"He went on: 'We trained the system – ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.'"

"This example, seemingly plucked from a science fiction thriller, means that: 'You can't have a conversation about artificial intelligence, intelligence, machine learning, autonomy if you're not going to talk about ethics and AI' said Hamilton."


This Is The Voice Of World Control - Colossus The Forbin Project (audio)

https://www.youtube.com/watch?v=ePpVWH0uYOI


Yep, that's an AWESOME movie... should be required watching for EVERYBODY involved in building AI... Far scarier IMHO than the Terminator movies... not as much action, but more chilling...
__________________
The X-87B Cruise Basselope-- THE Ultimate Weapon in the arsenal of Homeland Security and only $52 million per round!
Powered by: vBulletin Version 3.0.7
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Ye Olde Rocket Shoppe © 1998-2024