It's not been long since I posted the last article on AI, but some things have changed over the last few months. And they all went, I can't describe it otherwise... insane. This is a bit of a ramble, sorry for that.

More and bigger data centres

Some AI companies plan to build huge data centres on the scale of small cities.

Going big time nuclear

To power all this, huge investments are planned to build nuclear reactors by the dozens. (I'm not sure if they mean full-fledged reactors or some smaller modular design.)
Just to put this into perspective: it is not a natural scale of operation. Most AI companies still play the investors' game: get as many people attached to the product as possible, so you can run the numbers and see what price / profitability you might reach in the future. If you have a running, profitable business and want to scale up, you might make bold investments, but you also have a solid base to project your future monetary expense and gain.

The price of AI

We don't know the price of AI. Currently it's the "free lunch" period. It might not be sustainable for a lot of individuals and small companies once these AI providers actually want to make a profit. Here is a small estimate based on some currently available figures: OpenAI lost 11.5 billion last quarter, that's roughly 125 million per day. They recently announced they have 1 million business customers. A subscription costs around 30 bucks per month as of writing. That means they need roughly 125 times the current user base at the current price, OR each existing account has to pay a little less than 4000 bucks per month, if my calculation is right, just to break even without any profit. That is with current hardware and software. Now everyone with big money can estimate how much workforce time ChatGPT actually saved them, and whether it's viable for their business. And this price does not include future investments (and interest) in those nuclear reactors and data cities.

For those who can't afford it, like me, it means being cut off from this technology... but are we? Local image generators that run on modest systems (even on CPU) are becoming more or less easy to install, and often offer more options. They might be slower to use, but since they often give more direct feedback than the web interfaces and are easier to integrate into someone's pipeline, the lost time is less of an issue. Currently, good video models and LLMs do not run locally on machines with modest hardware, so those are safe for now.
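The estimate above can be written out as a short back-of-the-envelope calculation. All inputs are the article's rough assumptions (reported quarterly loss, customer count, monthly price), not verified financials, so the outputs are ballpark figures only:

```python
# Back-of-the-envelope break-even estimate for an AI provider.
# All figures below are assumptions taken from the article, not verified data.
quarterly_loss = 11.5e9     # USD per quarter (assumed)
customers = 1_000_000       # paying business accounts (assumed)
price_per_month = 30.0      # USD per account per month (assumed)

daily_loss = quarterly_loss / 92      # a quarter is roughly 92 days
monthly_loss = quarterly_loss / 3

# Revenue needed per month to cover the loss on top of today's revenue:
current_revenue = customers * price_per_month
required_revenue = monthly_loss + current_revenue

# Option A: keep the price, multiply the user base.
needed_multiple = required_revenue / current_revenue

# Option B: keep the user base, raise the price.
needed_price = required_revenue / customers

print(f"loss per day:       ~{daily_loss / 1e6:.0f} million USD")
print(f"user base multiple: ~{needed_multiple:.0f}x")
print(f"price per account:  ~{needed_price:.0f} USD/month")
```

With these inputs the numbers land close to the article's: about 125 million USD lost per day, a user base roughly 125-130 times larger, or just under 4000 USD per account per month.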
With a minimum of six years (the lowest estimate) to build a nuclear reactor, a lot can happen. If you commit that much time and money to a large data centre, you also commit to a singular strategy which might become outdated. And as someone who still remembers the Yahoo directory and the omnipresent AltaVista, and knows what happened to them once a more efficient and easier-to-use system like Google simply eradicated the market, I wouldn't currently bet on any company becoming the long-term market leader in AI. Yes, I'm still indifferent to AI, and no, I don't ignore the copyright problem, but that is now for the courts to decide. I fear, though, that it won't be judged in favour of the individual creator, but of large companies with huge assets.












