Artificial Intelligence discussion

With how much artificial intelligence has been improving in areas such as text reading and generation, picture analysis, picture generation, convincing voice synthesis, and more, I think there's a lot to discuss about the effects this technology will have on society.

I'll start off with one example.

I'd been thinking about the enshittification cycle of tech, and I think it's coming for Google hard. The search engine just isn't so great at finding what you actually want, and I think that's gonna leave a big opening for Bing with their use of AI. If the AI can sift through the crap and actually find what you want for real, due to its understanding of language, it'll actually make searching super useful again.

In the pre-Google internet, search engines used to search only for exact words and phrases, which had its uses, but also meant finding a lot of sites that simply crammed in a lot of popular words and phrases to get visitors. Google cut through the crap with a better understanding of how to "rank" sites relative to how relevant they are, and even find sites that are on the topic you were looking for without using the same exact words.
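
To make that ranking idea concrete, here's a toy Python sketch of link-based ranking in the spirit of PageRank, one of the signals Google famously used to outrank keyword-stuffed pages. The page names, link graph, and damping factor below are made up for illustration; real search ranking combines many more signals than this.

```python
# Toy sketch of link-based ranking in the spirit of PageRank.
# The graph, page names, and damping factor are illustrative only.

def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A keyword-stuffed spam page that nothing links to ends up scoring far
# below pages that other sites actually link to.
toy_web = {
    "spam-keyword-farm": ["good-article"],
    "good-article": ["reference-site"],
    "reference-site": ["good-article"],
    "blog": ["good-article", "reference-site"],
}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {page}")
```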

But Google started to become more advertiser-friendly, then later, more shareholder-friendly. There's a limit to how far you can push a product in service of shareholder growth, so as it turns to crap, it leaves an opening for a competitor to show up.

Since ChatGPT (which Bing is now plugged into) models the use of language, it can actually understand context and determine relevance based on that. And that'll make it huge, I think. Context-based understanding of web pages can potentially do an excellent job of finding what people actually want, in a way that goes way beyond Google's page ranking systems or the examination of exact words.
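
As a rough illustration of exact-word matching versus context-based relevance, here's a minimal Python sketch using an off-the-shelf sentence-embedding model (sentence-transformers). The model name, query, and documents are just example assumptions; this is not Bing's or Google's actual pipeline.

```python
# Minimal sketch contrasting exact-keyword matching with embedding-based
# relevance ranking. Model name and example documents are illustrative only.
from sentence_transformers import SentenceTransformer, util

documents = [
    "How to fix a bicycle tire that keeps going flat",
    "Best pizza restaurants near downtown",
    "Repairing a punctured bike wheel at home",
]
query = "my bike tyre is leaking air"

# Exact-word matching: requires every query word to appear in the page.
# No page contains "tyre" or "leaking", so nothing is returned.
exact_hits = [d for d in documents if all(w in d.lower() for w in query.split())]
print("Exact-match hits:", exact_hits)  # -> []

# Embedding-based matching: scores documents by semantic similarity,
# so the two repair pages rank above the pizza page.
model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(documents, convert_to_tensor=True)
query_vec = model.encode(query, convert_to_tensor=True)
scores = util.cos_sim(query_vec, doc_vecs)[0]
for doc, score in sorted(zip(documents, scores.tolist()), key=lambda p: -p[1]):
    print(f"{score:.2f}  {doc}")
```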

Edited by BonsaiForest on Dec 10th 2023 at 6:15:29 AM

TobiasDrake Queen of Good Things, Honest (Edited uphill both ways) Relationship Status: Arm chopping is not a love language!
#101: Jan 18th 2024 at 11:14:03 AM

I see AI as like any other tool that dramatically changes people's lives - capable of both good and evil, and likely to be used for both.

"Dramatically changes people's lives" might be overstating the impact of a slightly more complicated AutoComplete.

Edited by TobiasDrake on Jan 18th 2024 at 11:15:06 AM

My Tumblr. Currently liveblogging Haruhi Suzumiya and revisiting Danganronpa V3.
BonsaiForest Since: Jan, 2001
#102: Jan 18th 2024 at 11:18:53 AM

For now. It's in its infancy. And it's more than just chat, since it's being used to generate pictures. It's only going to become more capable and more powerful. I sometimes use it to ask complicated questions that would normally require a large number of Google searches to gather all the needed info rather than it being in one spot. (I pay for GPT-4, which is smarter and more capable than the free version.)

Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#103: Jan 18th 2024 at 11:22:17 AM

[up] Ugh, that really makes me cringe. If you're trying to learn about something new, that means you might not be able to tell if it's hallucinating something or not, and it's still averaging about 15% hallucination, as far as I can tell.

ChatGPT should under no circumstances be used as a source of information; it's way too unreliable, and you have to fact-check everything it says unless you want to end up with a wildly wrong belief.

Not Three Laws compliant.
BonsaiForest Since: Jan, 2001
#104: Jan 18th 2024 at 11:24:34 AM

True, but when I looked it up to see if it was right about those things, it always was.

The tech is flawed but improving. I'm interested to see where it'll be ten years from now.

Smeagol17 (4 Score & 7 Years Ago)
#105: Jan 18th 2024 at 11:27:57 AM

At least when I used GPT-3, whenever I asked it questions that I 'wasn't' able to just look up, it was always (obviously) wrong.

BonsaiForest Since: Jan, 2001
#106: Jan 18th 2024 at 11:37:59 AM

I don't even use GPT-3 anymore, and haven't since paying for the monthly fee for GPT-4.

I can tell you that GPT-4 is noticeably smarter than GPT-3. Not that it doesn't have flaws and need work, but again, this tech is in its infancy. Like, future generations will look back on GPT-3 and think, "People were impressed by that?!"

generation81 Since: Aug, 2021
#107: Jan 18th 2024 at 12:08:42 PM

As for why I hate AI, I feel like it takes jobs and is more often used by bigwig execs to avoid having to pay workers and losing their wealth.

Even when it's merely used as a tool, I'm wary of it.

BonsaiForest Since: Jan, 2001
#108: Jan 18th 2024 at 12:24:35 PM

Executives are indeed greedy, and whenever employers can save money, they always do. AI is simply, in that respect, the latest tool for this.

The internet killed a lot of physical magazines and newspapers, shifting those jobs online. Movie theaters resulted in fewer people going to plays. Cars made passenger trains unprofitable in the US (the government, via Amtrak, keeps them running so that infrastructure is still around). It's just life.

On the other hand, I'm more inclined to agree with concerns of plagiarism based on how AI obtains its knowledge (of pictures or writing - I had it both analyze, and later recite in full, a Robert Frost poem when demonstrating it to a coworker). As well as concerns that it's being shoved into things that it isn't good enough to do yet - I'm seeing that all over the place and I find it ridiculous. I do think this will be major technology that changes things a lot, once it actually gets there, which it isn't yet. Currently it's a tech demo for the most part.

When it kills jobs, which it undoubtedly will, the question is what new jobs will arise, where and how, how much will they pay, etc. I can't speak for other countries, but here in the US, our government really hates helping people in need.


Relatedly, there's this article talking about what people at companies are saying about their companies' use of AI tools.

The survey also found that different studio departments showed different levels of willingness to embrace AI tools. Forty-four percent of employees in business and finance said they were using AI tools, for instance, compared to just 16 percent in visual arts and 13 percent in "narrative/writing."

Narrative/writing has a much lower use of AI tools than the other departments. I can easily imagine why. Humans write stories far better and more imaginatively than AI does!

Developers cited coding assistance, content creation efficiency, and the automation of repetitive tasks as the primary uses for AI tools, according to the report.

“I’d like to see AI tools that help with the current workflows and empower individual artists with their own work," one anonymous respondent wrote. "What I don’t want to see is a conglomerate of artists being enveloped in an AI that just does 99% of the work a creative is supposed to do.”

Coding assistance, creation efficiency and automation of repetitive tasks. Augmenting people's creativity and helping speed up processes seems to be what they're excited about, and that makes sense.

Edited by BonsaiForest on Jan 18th 2024 at 3:45:44 PM

lu127 Paper Master from 異界 Since: Sep, 2011 Relationship Status: Crazy Cat Lady
#109: Jan 18th 2024 at 2:18:51 PM

ChatGPT should under no circumstances be used as a source of information; it's way too unreliable, and you have to fact-check everything it says unless you want to end up with a wildly wrong belief.

GPT-4 is reliable enough to write a literature review on its own with supervision from a subject-matter expert. Engineering academics love it because it takes away the tedious stuff (literature reviews, abstracts) and lets them do the part they care about, i.e. the output.

"If you aren't him, then you apparently got your brain from the same discount retailer, so..." - Fighteer
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#110: Jan 18th 2024 at 2:27:11 PM

With supervision, yes. Not as its own thing being operated by someone not familiar with the subject they’re asking about.

Not Three Laws compliant.
Silasw A procrastination in of itself from A handcart to hell (4 Score & 7 Years Ago) Relationship Status: And they all lived happily ever after <3
#111: Jan 18th 2024 at 2:27:51 PM

If it's being supervised by an expert, how is it doing it on its own? That seems like a contradiction in terms.

“And the Bunny nails it!” ~ Gabrael “If the UN can get through a day without everyone strangling everyone else so can we.” ~ Cyran
lu127 Paper Master from 異界 Since: Sep, 2011 Relationship Status: Crazy Cat Lady
#112: Jan 18th 2024 at 2:35:16 PM

"Supervised" in the sense that the prompts are made by someone familiar with the topic and with enough understanding to be able to fact-check and spot horrible/made-up references. And the point is that the expert looks for incorrect referencing/fallacies (because the expectation is there will be some) and can't find any. You have to actively try to make GPT 4 make up a reference, while 3.5 easily does it. There is world of difference between the two.

Edited by lu127 on Jan 18th 2024 at 12:36:25 PM

"If you aren't him, then you apparently got your brain from the same discount retailer, so..." - Fighteer
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#113: Jan 18th 2024 at 2:41:56 PM

"Dramatically changes people's lives" might be overstating the impact of a slightly more complicated Auto Complete.

I mean, AI, unqualified, does not automatically mean 'generative AI, specifically LLMs', and on the whole (especially with classification and recognition tasks), there's a lot it can do. Particularly on the accessibility front.
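
As a concrete example of that non-generative, recognition side, here's a minimal Python sketch that uses an off-the-shelf pretrained image classifier (torchvision's ResNet-50) as a crude accessibility aid, suggesting rough alt text for an unlabeled image. The file path and the idea of using it this way are assumptions for illustration, not any specific product's implementation.

```python
# Minimal sketch: an off-the-shelf image classifier used as a crude
# accessibility aid (suggesting alt text). Path and usage are placeholders.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT          # pretrained ImageNet weights
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()                  # matching preprocessing pipeline

img = Image.open("photo.jpg").convert("RGB")       # placeholder image path
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

top = probs.topk(3)
labels = weights.meta["categories"]
suggestion = ", ".join(labels[i] for i in top.indices.tolist())
print(f"Possible alt text: an image that may show {suggestion}")
```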

Avatar Source
Protagonist506 from Oregon Since: Dec, 2013 Relationship Status: Chocolate!
#114: Jan 18th 2024 at 2:47:05 PM

I believe AI has made significant advances in medical research as well, but I'm unsure of the details regarding that.

"Any campaign world where an orc samurai can leap off a landcruiser to fight a herd of Bulbasaurs will always have my vote of confidence"
Florien The They who said it from statistically, slightly right behind you. Since: Aug, 2019
#115: Jan 18th 2024 at 3:50:30 PM

[up] It's very good at protein folding problems, which were a big hurdle before.

Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#116: Jan 18th 2024 at 3:59:55 PM

It is very important, though, to actually involve the group an AI product is being made for.

Like, that AI sign language interpreter thing that went viral? Turns out it's complete garbage. Half of what it puts out is near gibberish, it's really hard to train because sign languages are all mutually unintelligible and have a huge amount of overlapping signs that mean radically different things, and sign language doesn't tend to be a word-for-word translation anyway - it's called interpreting for a reason.

Sign language is a distinct language, it's not just "English you use your hands for".

It turns out that people have been trying to make automatic sign language translators for decades and literally all of them have failed because people who need sign language always get ignored, because people don't respect the many different forms of sign language as distinct languages. Also, deaf people aren't very common in techbro spaces because a lot of techbros are disgustingly ableist.

Edited by Zendervai on Jan 18th 2024 at 7:02:08 AM

Not Three Laws compliant.
Imca (Veteran)
#117: Jan 19th 2024 at 12:10:17 AM

Even when it's merely used as a tool, I'm wary of it.

You have been using it since the 90s, or if you are younger than that, your entire life.

People only really care about it now, but AI is fundamental to how the internet works.

How do you think Google knows what you're looking for? How YouTube knows what you're looking for?

There is a reason machine learning as a field has been around since the 70s... fields that produce nothing don't get funding.

What's new is that GPT sparked a media firestorm, and everyone started to care.

I believe AI has made significant advances in medical research as well, but I'm unsure of the details regarding that.

Protein folding and chemistry.

The COVID vaccine, for instance, was designed with heavy AI assistance; that's part of why it hit production in only a year.

Edited by Imca on Jan 20th 2024 at 5:14:50 AM

Tremmor19 reconsidering from bunker in the everglades Since: Dec, 2018 Relationship Status: Too sexy for my shirt
#118: Jan 19th 2024 at 12:17:21 AM

[up][up] Now I'm curious what exactly doesn't work about it. You mentioned language variations - translations too rigid? Maybe the facial expressions? I don't know sign language, but my understanding is that it's not purely hand movements; it's also tone and body/face movements a lot. Wonder if video would be a better medium than gloves? Would love to see how someone who does speak fluent sign language would approach it.

[down] Ah, right, I forgot they were trying to do real time as well. Interesting, thanks.

Edited by Tremmor19 on Jan 19th 2024 at 3:57:59 PM

Imca (Veteran)
#119: Jan 19th 2024 at 12:28:20 AM

Speaking as someone who has done interpretation work (vocal, admittedly, not sign):

Anything pitching real-time interpretation is always going to fail. We have to wait until the other person is done speaking not just because it would be hella rude to talk over them... but also because languages do not fundamentally share the same structure.

Verbs may be up front in one language, in the back of another... sentence subjects may be optional or they may not... your language might be heavily contextual (please don't ask us to translate one line of Japanese; without the full context you're getting something very half-assed) and so on.

Sign, being a visual medium instead of an auditory one, I would imagine has these same issues but worse, before you throw in that, say, American Sign Language and Australian Sign Language are completely different languages even though both nations speak English.

Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#120: Jan 19th 2024 at 5:34:14 AM

Sign language is also pretty stripped-down in structure. Because you have to, like, literally make all the signs with your hands, someone "talking" fast in sign language is dropping a lot of words, usually connective words or context-sensitive words. Like, if you're talking about a person and you both know who you're talking about, a lot of sign languages emphasize dropping everything but what you need to get your point across, so a translator is stuck with a sentence that appears to have no subject.

And it sounds like this translator inserts context. It tries to make the result into a full sentence just based on what's in the sentence, so that's not good. And going off the dictionaries just results in signs that are incredibly verbose, so to speak, and extremely unnatural-sounding. It'd be like a machine translator deciding to do the Discworld Twoflower thing where he goes "do you know where an inn/hotel/hostel/lodging/space for rent is?" because he's reading from a phrasebook and doesn't, like, pick the contextually appropriate word for the situation, and mostly just confuses people. It's not... technically wrong... but it is really dumb and difficult to understand.

Again, this is squarely in the space of "hey, dipshits, next time you try to reinvent this thing that always fails, actually get a large number of deaf people involved."

This has been tried before, because tech people and linguistics undergraduates always assume that sign language is just "you speak x language with your hands", when it's a genuinely different language with a different approach. You can do word for word translations, but it comes off as unnatural, and getting deaf people involved would likely result in "this is stupid" and an admonition that any like, TV program or whatever using this thing would be better served just hiring a subtitler.

Edited by Zendervai on Jan 19th 2024 at 8:37:28 AM

Not Three Laws compliant.
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#121: Jan 19th 2024 at 6:31:55 AM

tbf "it's a language" is also the problem machine translation has with spoken and written text too. Turns out people refuse to communicate in perfect textbook form. [lol]

That doesn't mean that AI isn't the most appropriate technology for on the fly translation. "People are implementing it wrong" isn't a mark against it being better than just about any other approach we've had.

Avatar Source
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#122: Jan 19th 2024 at 6:39:55 AM

In this case though, machine translation of sign language has been attempted for decades and it always crashes and burns. And a lot of it is that it's a field people care so little about that there are multiple stories of people who got jobs as interpreters while just signing complete gibberish for decades, with everyone in power ignoring all of the deaf people complaining about it.

People aren't trying to automate it because they think it will benefit from automation; they want to automate it because they think it doesn't matter and would rather just have a program they can turn on than have to bother with the effort of hiring someone and vetting them properly.

And, again, machine translating sign language tends to either result in it inserting words and context that isn't supposed to be there or it's unnecessarily verbose to the point of being extremely difficult to understand.

It's like the people who try to reinvent the elevator and always forget about people in wheelchairs. They're just trying to shrink the footprint used by the technology because they don't think it matters and they don't get why it's important to make spaces accessible.

If the people developing an accessibility technology have consistently, for decades, had an enormous blind spot that consists of "oh, the people who would actually use this don't need to be involved", then it is being used wrong. Because it turns out that, culturally, if someone is deaf or hearing impaired, we just assume that they don't have opinions or valuable input.

Edited by Zendervai on Jan 19th 2024 at 9:45:01 AM

Not Three Laws compliant.
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#123: Jan 19th 2024 at 6:55:07 AM

... which rather obviously means the problem isn't with AI as a technology, it's with the people doing it. Which I get, but it's not an argument for "this is a bad tool for the job" when it is the best one we've devised. Like, remember that we've been doing machine translation of every other language for decades, and that's led to so many godawful signs all over the world, too. And that was when people did sort of care about what it said. But at least AI-powered translation of text is... well, relatively usable for basic functionality, and it's also been important in improving automated transcription too.

Translation and recognition (optical or audio) are like the one domain where you can almost always say that the problem isn't that it's the wrong technology and something else should be used, it's the people implementing it not putting in the effort and overpromising.

Avatar Source
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#124: Jan 19th 2024 at 7:17:35 AM

In this case, part of the issue is that the people this is intended for overwhelmingly hate it.

We don't actually have a good idea if it's a good fit or not, because the people designing it refuse to put anyone it's designed for on the project and the people it's designed for reject every single attempt at it.

Hell, one thing that sign language interpreters have to do is create new signs on the fly if someone says a new or unusual word. I have a really hard time believing that an AI would do anything but just spell it out.

Not Three Laws compliant.
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#125: Jan 19th 2024 at 7:20:37 AM

I'm pretty sure we've had the study come up before that AI can create new words in prose. If there was sufficient reason and impetus to develop a good implementation for signing, I see no reason it couldn't manage something.

Avatar Source
