bravetraveler 4 days ago

I'm pre-burned on all hype. If people are excited, I know to ignore it while it gets devalued in the search for value.

I don't like being this way but it's worked through several fads. I'll keep going with my Linux boxes. Y'all keep finding new window dressing.

oooyay 4 days ago

There are limited use cases for LLMs and ML pipelines that legitimately provide business impact and value beyond marketing. The market will learn that the hard way. Especially as the environmental cost of AI becomes more apparent, you'll likely see a social reaction similar to the one that ultimately bit the Bitcoin hype.

If you're feeling burnt out on it, then just don't pay attention. There's very little reason to at this point in time.

SeanAnderson 4 days ago

I think my expectations have become more realistic as time progresses, but no, I still enjoy the experience.

I just tackled an Arduino project for the first time. It was really nice being able to have AI talk me through the process, help me write in C++, which I'm not strong at, and generally be an accessible companion. I took on a project larger than I would've attempted without AI, and completed it more quickly and with more personal engagement than I would've done before LLMs were invented.

I don't think AI is going to magically generate the ROI that the market seems to believe, but it's such a nice modifier to my development process.

  • seattle_spring 3 days ago

    I tried that by having ChatGPT walk me through some Godot (game development framework) tutorials. Almost literally everything it said was wrong, but it confidently insisted its steps were correct even when I told it that it was mixing up features from older versions, steps from Unity, and other total blunders.

    • ffhhj 3 days ago

      I've found Poe's assistant to be much better at gamedeving.

randomdata 4 days ago

For those of us who live in a bubble, what AI hype is out there?

I recently had a problem cross my desk which the stakeholders thought might be solvable with an LLM, which may have been true, but ultimately was more easily solved using plain old boring programming techniques. Is that the hype?

While the final solution didn't need to involve LLMs or anything of that nature, I don't think those stakeholders would have even envisioned the possibility of trying to solve the problem had the semi-recent technological developments that have taken on the AI moniker not enabled them to think more creatively about what machines might be able to do. Without that, there is little chance I would have become aware of the problem someone faced.

If that's the AI hype, I'm game. Being able to get computers to do things others didn't think possible is what brings me to the world of software development.

  • al_borland a day ago

    When you don’t get a raise next year because you used boring programming techniques instead of AI, that’s the hype. It doesn’t matter if the problem gets solved; it just matters whether the marketing team can use “AI” in the sales pitch.

    • randomdata 11 hours ago

      Why can't they use "AI" in the sales pitch?

  • taurath 3 days ago

    Hype is that companies can only raise money or keep their stock price afloat if they have some angle into the Next Big Thing, regardless of whether the Next Big Thing actually turns out to be that.

  • moooo99 3 days ago

    I would define hype as a situation where the shovel sellers make more money than the shovelers combined

    • didgetmaster 2 days ago

      I would define it as when the shovel makers pretend that it isn't just a great tool for digging a hole, but that it will also do everything else: cook your food, make your clothes, and repair your car.

wruza 4 days ago

I guess we all are burnt out of the “hype” concept and its meta (not referring to this thread sarcastically, but not excluding it either).

With all these shallow short/micro “blogging” platforms we’ve taken a couple of steps back in cultural development. It’s just too much speech to process properly, and most of it is useless if not harmful.

b20000 3 days ago

Yes, very much. I have the impression that some FAANG-type companies have diverted resources and headcount to AI-related roles while keeping the job openings around, which may mean that if you apply to a job and it's not an AI job, you might never hear back. One person who used to work at one of these companies confirmed to me that resources have been put on AI and taken away elsewhere. What also concerns me is that the idea being sold is that AI will replace software engineering, and of course many companies love that idea and will happily buy into it regardless of whether it ever happens. I think the effect will be that many people switch careers, and after the AI thing blows over there will be less talent around, or the talent will be AI-focused, and it will be a problem to find talent that can get the job done, whatever it is.

karmakaze 3 days ago

I thought I might be, but I try to follow stories on advances in the state of the art rather than "me too" applications of it. Novel applications are also interesting when they break new ground. If you remove the money/corporate aspect, thus ignoring the "X raises $Y for Z" stories, it cuts out a lot of noise.

What I do get tired of seeing are the "it's only ML" (a bunch of if-then statistics, etc.) stories and also some of the "Is AI sentient yet" kind. The latter will be important when non-fringe folks actually believe it is, rather than just filling news feeds. Stories that try to analyze consciousness in some form and compare biological to machine are very interesting, as they attempt to expand our understanding. I suppose learning to skip over the hype is a way to deal with it.

badtension 3 days ago

Very much so. People try to sell it like it's a magical solution to all the world's problems. Even if it were as amazing as people tout it to be right now (it isn't), it wouldn't make our lives better. Maybe faster, a bit cheaper (for some), but not better in the most fundamental sense. We can do better right now but choose to ignore our most important issues to chase the new shiny thing.

AI R&D definitely has its place, but certainly not like this; this feels like just another hype bubble.

A good read explaining some of it: https://softwarecrisis.dev/letters/llmentalist/

darthrupert 4 days ago

I have grown not to hate coding and programming, but to be really bored by them. AI promises an extension to my usefulness despite that boredom.

So I'm only excited about it. Traditional programming is the thing I'm really burnt out about.

rurban 4 days ago

Not burnt out, but extremely careful with marketing on AI, because this one bad actor, sama, with his series of incidents, could bring the whole industry down as in the first AI winter. Which did cost me a very good job back then.

It's still good technology, but the hype is doing more harm than good and it looks very fragile. I'd recommend concentrating on different marketing terms like vision, chatbot, etc.

BerislavLopac 4 days ago

As someone nicely put it: when selling it's AI, when hiring it's ML.

  • jpl56 4 days ago

    ... and when running it's SQL

delichon 4 days ago

That's like being burnt out on the airplane hype in 1904. You have several annoying decades in front of you. I'm in slack-jawed awe of it and feel grateful to have lived long enough to experience the start of it.

  • al_borland a day ago

    Imagine in 1904 if everyone told Henry Ford to stop messing around with the Model A, because air travel was the future. The automobile still had a lot of exciting years ahead of it and is still going strong 120 years later.

    AI is a tool in the tool belt, and that tool will improve over time. Just like an airplane is one tool we can use for travel. It’s a useful tool, but it isn’t always the best choice.

  • HeatrayEnjoyer 4 days ago

    All of this is correct, except decades should be replaced with years. Unlike prior technologies, intelligence accelerates its own advancement the better it gets.

paulcole 3 days ago

No! I use it every day at work.

I hope other people are burned out on it and not using it — I’ll be happy being the only one taking all the benefits :)

rthnbgrredf 3 days ago

I'm burnt out on calling every new piece of technology that starts to get wide adoption a hype or a bubble. Yes, there will be the usual phase of over-excitement before it all settles; we all know that's part of human nature, so what? In the Gartner hype cycle model, I'm already on the plateau of productivity with my use of GPTs for code generation, technical troubleshooting, and everyday life questions.

dotcoma 2 days ago

I'm not. I haven't tried a single LLM tool and I don't expect to anytime soon.

cryptoz 4 days ago

Not me. But I do get it, sometimes hearing the grandiose expectations gets to be too much. These days I just tune it out mostly and keep my head down working (on an AI project…)

KolenCh 3 days ago

I’m burnt out by questions like this every other day. People are complaining about trends, like AGI, “I use arch/nix btw”, “written in rust”.

Get on with it. There are always trends you don't like. If I felt burnt out every time I saw a trend I didn't like, I'd be so burnt I'd be in an urn by now.

robador 4 days ago

Yes and no. Yes, I believe it is overhyped and in many cases causes more problems than it solves. For instance, it's easier to create content now, but the quality is usually mediocre at best. I think that's because whatever it's used for still depends on humans for its quality. You get average Joe using AI to code or write, and the output is still going to be mediocre; it's just gotten easier and faster to produce it. To me that's a net loss. Companies sprinkling AI features on everything now is more likely to make me roll my eyes; it's become a gimmick.

At the same time I do think it's an incredible tool, and I personally do use it as a sparring partner: to do quick experiments, to explore ideas or technology I don't have experience with. For example, in my current position I found myself constantly hitting limits with Excel. AI enabled me to use Python, Pandas, Sklearn, and other libraries to great effect, all stuff I didn't have prior experience with (a rough sketch of the kind of thing is below). So I understand the excitement.
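
To make that concrete, here's roughly the sort of Pandas snippet that replaced a pile of Excel formulas for me; the file and column names are made up purely for illustration:

    import pandas as pd

    # Load a spreadsheet export that had outgrown Excel (hypothetical file and columns)
    df = pd.read_csv("monthly_sales.csv", parse_dates=["order_date"])

    # One groupby instead of a wall of pivot tables and VLOOKUPs:
    # monthly revenue totals, average order value, and order counts
    summary = (
        df.groupby(df["order_date"].dt.to_period("M"))["revenue"]
          .agg(["sum", "mean", "count"])
          .rename(columns={"sum": "total", "mean": "avg_order", "count": "orders"})
    )

    print(summary.head())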

  • oceanplexian 3 days ago

    > You get average Joe using AI to code or write, and the output is still going to be mediocre; it's just gotten easier and faster to produce it.

    The value proposition is that the vast majority of what people do is produce mediocre work product.

    99% of people don’t even write code; they do boring things like fill in spreadsheets or tick boxes on TPS reports. Even among “skilled” labor, fields like software engineering probably involve a lot of copy/pasting code for CRUD apps. The current gen of AI is going to be able to trivially replace all of those things as it gets put into practice, even if the next gen never materializes.

999900000999 4 days ago

AI and VR are both just getting started.

If you aren't at least using AI to generate boiler plate code your already behind.

Of course AI can make mistakes, but it's going to change everything.

  • hi-v-rocknroll 4 days ago

    VR is a zombie category like flying cars that rears its head every 10 years or so. If there were anything to it, it would already be a thing. Instead, it is mostly a comical MacGuffin of 80's and 90's sci-fi movies and sells a few widgets now and then like Meta Quest. Anyone remember VRML? Or Sega's canceled Sega VR?

    AR, OTOH, has immense, untapped potential to be ubiquitous and helpful, albeit to a potentially intrusive degree if privacy isn't managed properly. The ambient, contextual, and location-dependent helpfulness achievable with AR creates undeniable advantages.

    AI code tools cannot create much that is usefully intricate, specific, or reliably correct at this point. They're helpful for completing predictable sequences like advanced code completion, but that's about the limit of their utility right now. At this moment, the wisdom and experience of good engineers cannot be substituted when creating custom, reliable solutions, but this may gradually change at some point.

    Narrow AI will need to be progressively and iteratively tuned to make it more useful for specific point solutions, but there is almost zero hope of AGI ever being realized, much less a SkyNet or Johnny Number 5 scenario, without absurd computing infrastructure and ongoing electricity costs.

    • 999900000999 3 days ago

      The Quest 2 and 3 are at a price point where normal people will buy them.

      Imagine once they get it down to half the weight. The headsets are already pretty popular.

      > AI code tools cannot create much that is usefully intricate, specific, or reliably correct at this point. They're helpful for completing predictable sequences like advanced code completion, but that's about the limit of their utility right now.

      The same can be said of junior engineers. Would you rather pay $20 a month for AI tools your senior engineers can use, or $110k for a junior engineer? I think people are underestimating how many jobs will be replaced by AI.

    • HeatrayEnjoyer 4 days ago

      >there is almost zero hope of AGI ever being realized, much less a SkyNet or Johnny Number 5 scenario, without absurd computing infrastructure and ongoing electricity costs.

      Microsoft is constructing a mammoth data center at an inflation-adjusted cost comparable to the Apollo Program.

      We already know AGI is possible with 20 watts. It's only a matter of time before it is created on another medium.

      • hi-v-rocknroll 4 days ago

        Doesn't matter. Meta and Microsoft are both doing stupid corporate shit. It's a total waste of resources and entropy. It's the equivalent of a nerd pissing contest.

        AGI doesn't apply to biological systems because it's not artificial by definition. You're falling into their fallacy that simulating neurons is going to be efficient in another form. Science has thus far shown it's very expensive and difficult, when a neuron itself would be the simplest and best solution.

      • rsynnott 3 days ago

        I mean, "Microsoft is spending a lot of money on a thing" is not, historically, anything close to a guarantee that the thing will come to pass. Nearly all of their megaprojects after Windows NT in the 90s have, essentially, failed; I think the only counterexamples would be Xbox and Azure (maybe Surface and Bing, if you want to be _very_ charitable in your definition of success).

  • carapace 4 days ago

    s/your/you're/

    • bravetraveler 4 days ago

      I upvoted this from hidden because, in context, it tells several stories.

      The one I like is that speed and accuracy are related. Conclusions to be drawn from there.

      Normally I'd sigh at such a basic interaction and perhaps go the other way, but this was notable.

      A human in their free time is making a case to other humans about time-saving... yet couldn't be bothered to check their form! All of this to what end?

      • carapace 3 days ago

        (Thank you! It means a lot to me that someone "got it"! Cheers and well met.)

        • bravetraveler 2 days ago

          Happy to fight the good fight :)

          One argument I'd like to leave against 'pro AI' stances like the one that started this thread (note: I edited the form for correctness):

          > If you aren't at least using AI to generate boiler plate code you're already behind.

          The time/effort taken to write boilerplate is inconsequential. Outsourcing it to AI feels like robbing myself of the "mental vegetables". Strength comes from struggle and all of that.

          While I'm writing the loop I'm shaking out other design details, or 'feeling where the rubber meets the road'. It's a good feedback loop. Taking this away is like squeezing a stone with no juice to give.

          I argue that, given enough time, people outsourcing the basics to AI will find things less scientific and more magical. That could overall be a good thing... but perhaps not in R&D.