25+ yr Java/JS dev
Linux novice - running Ubuntu (no Windows/Mac)

  • 0 Posts
  • 11 Comments
Joined 9 months ago
Cake day: October 14th, 2024


  • The most tedious part I’ve seen so far is that there is so fucking much to upgrade. At least 20 buildings, and probably a fair bit more with construction, and they all go C, B, A, S; every upgrade takes time and can only be done one at a time, so getting your town built up is pretty tedious. Time will tell if the rewards are worth it. At least you can collect your rewards from all settlements (max 4, I think) by stopping at just one. So once you’re built up, maybe it’s fine?



  • I think it’s fair to discuss the energy. I’m not sure where the math comes from that 100 words takes .14kWh. My video card uses 120W pegged and can generate 100 words in, let’s say, a nice round 2 minutes. So that works out to 4Wh, or .004kWh. But of course they are running much more advanced and hungry models, and this is probably generating the text and then generating the voice, and I don’t know what that adds. I do know that an AI tool I use added a voice tool and it added nothing to the cost, so it was small enough for them to eat, but also the voices are eh and there are much better voice models out there.
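
    If anyone wants to poke at that arithmetic, here’s the lower bound as a quick Python sketch; the 120W and the 2 minutes are my own rough numbers, not measurements:

```python
# Lower-bound energy for ~100 words generated locally.
# Assumptions (mine, rough): GPU pegged at 120 W for about 2 minutes.
gpu_watts = 120
minutes = 2

energy_wh = gpu_watts * (minutes / 60)   # 4.0 Wh
energy_kwh = energy_wh / 1000            # 0.004 kWh

print(f"{energy_wh:.1f} Wh ({energy_kwh:.3f} kWh) per ~100 words")
print(f"the quoted 0.14 kWh figure is ~{0.14 / energy_kwh:.0f}x this lower bound")
```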

    So that’s fine, I can pretty well define the lower bound of what a line of text could cost, energy-wise. But this strategy doesn’t get us closer to an actual number. What might be helpful… is understanding it from EA’s perspective. They are doing this to increase their bottom line by driving customer engagement and excitement, because I haven’t heard anything about this costing the customer anything.

    So whatever the cost of all the AI they are using is, it has to be small enough for them to simply absorb in the name of increased player engagement leading to more purchases. The number I just found is $1.2 billion in profit annually. Fuck, that’s a lot of money. What do you think they might spend on this? Do you think it would be as high as 2%? I’ll be honest, I really don’t know. So let’s say they are going to spend $24 million on generative AI, and let’s just assume for a second that all of it goes to power.

    I just checked and the national average is $0.1644 per kWh, but let’s cut that in half assuming they cut some good deals? (I’m trying to be completely fair with these numbers, so disagree if you like. I’m writing this before doing all the math, so I don’t even know where this is going.) That works out to about 292 million kWh (or… that’s just 292 GWh, right?)

    I read global electricity usage is estimated at 25,500 TWh, and (check my math) that works out to about 1/87,000th of the world’s annual electricity consumption. Kinda a lot for a single game, but it’s pretty popular.
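
    Same deal for the budget-to-energy guess, written out so the assumptions (the 2%, the halved rate, everything going to power) are explicit:

```python
# Back-of-envelope: 2% of EA's reported annual profit spent on generative AI,
# all of it assumed to go to power at half the average US rate.
annual_profit_usd = 1.2e9
ai_budget_usd = annual_profit_usd * 0.02   # $24 million (my arbitrary 2% guess)
usd_per_kwh = 0.1644 / 2                   # national average, halved for bulk deals

ai_gwh = ai_budget_usd / usd_per_kwh / 1e6            # ~292 GWh
global_electricity_twh = 25_500                       # rough global annual consumption
fraction = ai_gwh / (global_electricity_twh * 1_000)

print(f"{ai_gwh:.0f} GWh/yr, about 1/{1 / fraction:,.0f} of global electricity")
```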

    But the ask is how that compares to video cards, and… let’s be honest, this is going to be a very slippery, fudge-y number. I was quoted 1.5 million daily players (I see other sources report up to 30 million, which is a really wide range, but let’s go with the lower number). So the question is: how long do they play on average, and how much power do their video cards use? I see estimates of 6-10 hours per week and 8-10 hours per week. Let’s make it really easy and assume 7 hours per week, or 1 hour per day.

    I have a pretty low-end video card, but it’s probably still comparable to or better than some of the devices connecting to Fortnite. I don’t have a better number to use, so I’m going to use 120W. There should be a lot of players higher than that, but also probably a lot of Switches and whatnot that are lower power. Feel free to disagree.

    So 1.5m players x 1 hour per day x 120W = 180MWh per day, x 365 = about 65.7GWh per year.

    By these numbers the AI uses roughly 4.5x the power of the GPUs. So there is that. But also I think I have been extremely generous with these numbers everywhere except maybe the video card wattage, which I really have no idea how to estimate. Would EA spend 2% expecting to recoup that in revenue? What if it’s 1%? What if it’s .5%? At .5% they are getting pretty close.
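
    And the player-side half of the comparison, with the same caveat that the player count, hours, and wattage are all guesses:

```python
# Player-side estimate: daily players x hours x device wattage, annualized.
daily_players = 1.5e6    # lower of the figures I found (others report up to 30M)
hours_per_day = 1        # roughly 7 hours per week averaged out
device_watts = 120       # wild-ass guess at the average device draw

players_gwh = daily_players * hours_per_day * device_watts * 365 / 1e9  # ~65.7 GWh/yr
ai_gwh = 292             # from the budget estimate above

print(f"players ~{players_gwh:.1f} GWh/yr vs AI ~{ai_gwh} GWh/yr "
      f"(AI ~{ai_gwh / players_gwh:.1f}x)")
# 15 million daily players, or an average draw around 530 W, would flip this on its own
```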

    Or if the number of daily players is 15 million instead of 1.5 million, that alone is enough to tip the scale the other way.

    And device power is honestly a wild-ass guess. You could tell me the average is 40W or 250W and I’d have no real basis to argue.

    If you have any numbers or suggestions to make any of this more accurate, I’m all ears. The current range of numbers would lean toward me being wrong, but my confidence in any of this is low enough that I consider the matter unresolved. I also didn’t dive into how much of AI cost is power vs. infrastructure. If only half the cost of AI is power (and it’s probably lower than that) it changes things significantly.

    I’m going to stick with my assertion, but my confidence is lower than it was.



  • Okay. So, your position is that 6-year-olds are going to join Fortnite to spam the funny-man-speak button, and because of that AI energy usage will be higher? Okay. Maybe. I’d argue the novelty of AI wears thin really quickly once you interact with it a lot, but I’ll grant you some folks might remain excited by AI beyond reason.

    So now they are logging into Fortnite and, rather than playing the actual game, they are just going to talk to characters? It doesn’t make a lot of sense to me. But fine, throw out the other commenter’s numbers and suppose it’s not 7 generations to equal 30 minutes of play; maybe it’s 20, maybe it’s 40, maybe it’s 100. I honestly don’t know. But we’re definitely in the realm where betting that a player’s video card uses more energy than the AI responding to them (and that all the video cards together use more energy than the AI for all players) is a perfectly reasonable position to take.

    I bet that is the case. I don’t know it. I can’t prove it right or wrong without actual numbers. But based on my ability to generate images and text locally on a shit video card, I am sticking with my bet.



  • What I said was I’ll bet one person uses more power running the game than the AI uses to respond to them. Just that.

    Then you started inventing scenarios and moving goalposts, comparing one single video card to an entire data center. I guess because you didn’t want to let my statement go unchallenged, but you had nothing solid to back it up. You’re the one who posted 6500 joules, which you supported, and I appreciate that, but after that it’s all just supposition and guesses.

    You’re right that it’s almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.

    But I know someone who works for a company that has an A100 used for serving AI. I’ll ask and see if he has more information or even a better-educated guess than I do, and if I find out I’m wrong, I won’t suggest otherwise in the future.


    > We know that most of the closed source models are way more complicated, so let’s say they take 3 times the cost to generate a response.

    This is completely arbitrary supposition. Is it 3x a “regular” response? I have no idea. How do you even arrive at that guess? Is a more complex prompt exponentially more expensive? Linearly? Logarithmically? And how complex are we talking, when system prompts themselves can be 10k tokens?

    > Generating an AI voice to speak the lines increases that energy cost exponentially. MIT found that generating a grainy, five-second video at 8 frames per second on an open source model took about 109,000 joules

    Why did you go from voice gen to video gen? I don’t know whether video gen takes more joules or not, but there’s no actual connection here. You just decided that a line of audio gen is equivalent to 40 frames of video. What if they generate the text and then use conventional voice synthesizers? And what does that have to do with video gen?

    > If these estimates are close

    Who even knows, mate? You’ve been completely fucking arbitrary and, shocker, your analysis supports your supposition, kinda. How many Vader lines are you going to get in 30 minutes? When it’s brand new probably a lot, but after the luster wears off?

    I’m not even telling you you’re wrong, just that your methodology here is complete fucking bullshit.

    It could be as low as 6500 joules (based on your link), which changes the calculus to 60 lines per half hour. Is it that low? Probably not, but that is every bit as valid as your math, and I’m even using your numbers without double-checking.

    At the end of the day, maybe I lose the bet. Fair. I’ve been wondering for a bit how they actually stack up, and I’m willing to be shown. I suspect using it for piddly shit day to day is a drop in the bucket compared to all the mass corporate spam, but I’m aware it’s nothing but a hypothesis and I’m willing to be proven wrong. Just not based on this.



  • I’m not sure I’d wonder any more about those developers than anyone else.

    Without getting into TMI, I very much enjoy non-consent fantasy, to the point where I had a CNC relationship with my wife for the first five years. And two things I’ve learned about myself are that I’m not physically capable of even play rape, and that having the power was much more fun and exciting than exercising it.

    So to me it feels perfectly normal to be aroused by fantasy rape but not at all by the manifestation of it. I’m sure it’s not mainstream or anything, and there are a lot of people out there, like Andrew Tate, who are exactly what they say they are.

    I think it’s fine to look into it and see which is the case, but I personally wouldn’t expect to find any girls in basements who aren’t there consensually based on the content of their games.