
Disney making $1 billion investment in OpenAI, will allow characters on Sora AI video generator

TsWade2

Well-Known Member
As much as this may be the greatest news ever, Disney is not giving up on AI. And please keep in mind that Josh D’Amaro says people won’t be replaced by AI. AI is a tool, not a replacement.
 

flynnibus

Premium Member
LLMs are a technological dead end, though, at least in terms of trying to create true artificial intelligence. That's something people don't talk about as much as they probably should in terms of AI investment, the bubble, etc.; any company that wants to move towards creating an actual AGI (which is often the stated end goal) has to go elsewhere, and the work put into LLMs isn't going to be very useful (if at all).

This is a "you'll never be perfect, so why try at all" kind of dismissive argument. Artificial General Intelligence is not the end goal for everyone.

Internally we are working on ways to have models learn how our complex software systems interact and function. We sometimes have to do things to make the system more interpretable to the agents so they gain a better and better understanding of the solution. With this knowledge, not only will they help new people understand how something functions (something that is difficult to do in large systems that have been broken into smaller microservice chunks), but they will be used to analyze the system for issues, help support troubleshoot things, and help CHANGE the system for new features, etc.

Even without that full grasp of the larger systems, AI coding tools are very effective at extending existing functionality, refactoring code, etc. Even in the simplest sense of a 'coding coach', the AI tools are highly effective at digesting large complex systems and providing aids.

Other roles can use the tools to crunch large data sets and create correlations and summaries - highly useful for doing things like market research, exploring unfamiliar spaces, etc. Other roles are using agents to help manage busy queues of activities and direct workers to the most important task at hand, etc.

This is the kind of stuff that our business will thrive with long before anyone has an agent that is smarter than a human. We don't need that to drive value today.
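For what it's worth, the "make the system interpretable to the agents" idea can be sketched in miniature: curate short notes about each service, then retrieve the relevant ones into the model's prompt before asking a question. Everything below is a toy stand-in I made up for illustration (the service names, the keyword scoring, and the prompt shape are not anyone's real internal tooling); a production setup would use embedding search and an actual LLM call.

```python
import re

# Hypothetical registry: each microservice mapped to a short human-written
# note. Curating these is the "make it interpretable to the agent" step.
SERVICE_NOTES = {
    "billing": "Handles invoices. Calls the payments service for charges.",
    "payments": "Wraps the card processor. Emits charge events to the ledger.",
    "ledger": "Append-only record of all money movement events.",
}

def _words(text: str) -> set[str]:
    """Lowercase word set with punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve_context(question: str, notes: dict[str, str], k: int = 2) -> list[str]:
    """Rank service notes by keyword overlap with the question; keep top k."""
    q = _words(question)
    ranked = sorted(notes.items(), key=lambda item: -len(q & _words(item[1])))
    return [f"{name}: {text}" for name, text in ranked[:k]]

def build_prompt(question: str, notes: dict[str, str]) -> str:
    """Assemble the context-plus-question prompt the agent would receive."""
    context = "\n".join(retrieve_context(question, notes))
    return f"System notes:\n{context}\n\nQuestion: {question}"

print(build_prompt("How do charges reach the ledger?", SERVICE_NOTES))
```

The point of the sketch is the shape of the loop, not the scoring: the agent only ever sees a small, curated slice of the system, which is why the quality of those human-written notes matters so much.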
 

UNCgolf

Well-Known Member
This is a "you'll never be perfect, so why try at all" kind of dismissive argument.

That's not what I said or implied.

LLMs have plenty of productive uses. But there are also a number of things they can't do well (and will never do well due to how they function), and a significant portion of the AI investment/hype (certainly not all, and I'm not suggesting otherwise) has ignored that.
 

Tha Realest

Well-Known Member
The bigger picture requires people with skills that have to be developed somehow. People didn’t just jump straight to being a lead animator; there were skills that needed to be developed first. That’s true in a lot of other fields, especially creative ones.
I don’t disagree with this. However, it is important to look at history here. We don’t have hand-drawn animation any longer largely because of labor/development costs (it’s difficult to edit or revise after drawing and inking) and audience taste preferences. In both instances, economics and preferences moved away from more personally curated work to work done with computers.

While some bemoan the loss of the conventional animation pipeline - and I’m one of the people who does not like that loss! - some modern animation made by conventional means hasn’t helped the cause against AI. I do think the arguments against the “Cal Arts” style are somewhat overwrought, but you also can’t deny there’s a uniform unpleasantness to a lot of modern animation.

In general, there will be a loss of pipelines and apprenticeships. This will be felt. Two of our better filmmakers today got their start doing the lowest of grunt work at Corman Studios and Troma Studios.
 

ChrisFL

Premium Member
Go on, feel free to ignore AI. We argue about silly things like theme and sightlines; AI is something that is and will affect everyone every day, and you ain't seen nothing yet!

Oh, we're seeing it: in electricity usage and bills going way up, fresh water being used for AI datacenters instead of people, the cost of computer hardware going up exponentially, the loss of jobs caused by CEOs who think AI can replace everyone...
 

AidenRodriguez731

Well-Known Member
This is a "you'll never be perfect, so why try at all" kind of dismissive argument. Artificial General Intelligence is not the end goal for everyone.

Internally we are working on ways to have models learn how our complex software systems interact and function. We sometimes have to do things to make the system more interpretable to the agents so they gain a better and better understanding of the solution. With this knowledge, not only will they help new people understand how something functions (something that is difficult to do in large systems that have been broken into smaller microservice chunks), but they will be used to analyze the system for issues, help support troubleshoot things, and help CHANGE the system for new features, etc.

Even without that full grasp of the larger systems, AI coding tools are very effective at extending existing functionality, refactoring code, etc. Even in the simplest sense of a 'coding coach', the AI tools are highly effective at digesting large complex systems and providing aids.

Other roles can use the tools to crunch large data sets and create correlations and summaries - highly useful for doing things like market research, exploring unfamiliar spaces, etc. Other roles are using agents to help manage busy queues of activities and direct workers to the most important task at hand, etc.

This is the kind of stuff that our business will thrive with long before anyone has an agent that is smarter than a human. We don't need that to drive value today.
Hi, game dev here. AI sucks at teaching coding, and that should be obvious. It gives overly complicated explanations that slow down your development. You don’t learn through trial and error; rather, you do what the AI tells you to do.

If you’re basically just following a teacher the entire time and have them with you on the “test”, then you haven’t learned much of anything.
 

flynnibus

Premium Member
Hi, game dev here. AI sucks at teaching coding, and that should be obvious. It gives overly complicated explanations that slow down your development. You don’t learn through trial and error; rather, you do what the AI tells you to do.

If you’re basically just following a teacher the entire time and have them with you on the “test”, then you haven’t learned much of anything.
If that's your hot take as an actual coder... then either your tools are garbage, or you are not really trying.

As someone who develops product in a billion dollar business unit - our take is the polar opposite of yours.
 

Alice a

Well-Known Member
Yes, it is true - but again, it's about the disruption that comes with any transition. Some people over-rotate, some think they know what new roles are needed (but will be wrong to a degree), and over time the industry will re-calibrate and rebalance with the tools and needs they have.

The hype is "AI is destroying creativity and jobs" -- the reality is most businesses are thinking "AI will change how things are produced" and want to use tools to replace what would otherwise just be expensive human labor.
And of course, since employers have more available funds after replacing entry-level, “expensive human labor”, we’ll all get paid more now, right?

Right?

 

AidenRodriguez731

Well-Known Member
If that's your hot take as an actual coder... then either your tools are garbage, or you are not really trying.

As someone who develops product in a billion dollar business unit - our take is the polar opposite of yours.
No, I'm not going to give my full effort to learning from something that fundamentally makes the environment a worse place. Billion-dollar operations are very often not very efficient anyway. I don't need to be told by a machine what to do. I think critically myself and will research information myself, which is going to be a dying art if people just take the cheap, mass-produced, and exploitable way.
 

flynnibus

Premium Member
No, I'm not going to give my full effort to learning from something that fundamentally makes the environment a worse place.

So at 22… and just entering the workforce… during a transition where entry level work is being choked out by new tech… you’re going to refuse to get savvy in the stuff that is going to be part of every interview?? (because every person will be expected to find the best use of it for their role)



Maybe you went to school for game development… but you won’t be employed for long as one with that attitude (if ever)
 

HauntedPirate

Park nostalgist
Premium Member
So at 22… and just entering the workforce… during a transition where entry level work is being choked out by new tech… you’re going to refuse to get savvy in the stuff that is going to be part of every interview?? (because every person will be expected to find the best use of it for their role)


Maybe you went to school for game development… but you won’t be employed for long as one with that attitude (if ever)
Game development could be his side hustle. He's posted multiple times in the past that he's a nurse and makes decent money doing that.
 

MrPromey

Well-Known Member
That’s not a good thing! That nonsense before the recipe is human creativity! It’s usually creativity producing something bad and uninteresting, but it’s still creativity, and some small portion of it may be a precursor to the production of good, meaningful work. Quite a few great writers began doing technical writing or other “meaningless” work - ask AI for examples! (And then check whatever it says). Even for some of the writers who never grow and mature, that writing often still means something to them, enriches them in some way that matters.

I never, ever want to read work produced by AI. I don’t care if it’s just an e-mail apologizing for clogging the toilet, I want to hear a human.
LLMs are really popular with leadership in my company.

I watch senior management show how they use it to craft more professional-sounding emails, and then I see others using it to summarize or "bullet point" those emails. It's literally "AI" being tasked with writing professional-sounding content the author struggles to write on their own, and then, on the other end, "AI" dumbing down that same content into summaries or talking points for the person who gets the email.

The funny part is, they all know they're all doing it. They encourage us to do it for efficiency. Somehow they completely miss the irony.

Meanwhile, these people are becoming worse and worse at face-to-face conversations and being able to talk intelligently about what we're doing because they're really not paying attention.

After the pandemic, there was a push to get everyone back in the office. That's started to shift because leadership doesn't want to be there. In Teams meetings, it's a lot easier to not pay attention and feed the transcript through an LLM to summarize or write work instructions based off of it, etc., than it is to sit in a room and actually engage in the conversation, learn, provide constructive feedback, etc. Instead they can be using that valuable time to have ChatGPT punch up their emails, I guess.

Again, this is no secret. I've been given tips on doing this kind of thing myself, so I understand what's going on when I see faces that seem entirely engaged in something other than the meeting while I'm talking or presenting. When it started, I found it incredibly off-putting from the early adopters of this process. I now don't take it personally that they seem to be ignoring me the whole time. They call it multi-tasking while on the call.

What I wonder is: what happens when everyone in the meeting wants to do that, and nobody is willing to engage in the discussion to provide the content that gets transcribed, summarized, repackaged, and re-credited? Or are these tools only intended for the "important" people, with the thinking all delegated to those they feel are replaceable/expendable?

It seems like CEOs and senior executives should be the ones most afraid of AI, and yet somehow they all seem to feel confident it'll be everyone but them replaced by it in the not-too-distant future. Meanwhile, I see them using it in ways that kind of make most of them seem redundant.
 

