Post by Neal on Mar 22, 2023 6:15:41 GMT -5
It's an AI that can write stories, create web pages, and draw pictures!
Post by Dabeagle on Mar 22, 2023 17:18:27 GMT -5
Next come the Cybermen and the Daleks.
Post by dgt224 on Mar 23, 2023 2:27:41 GMT -5
I think we've got a ways to go before Cybermen and Daleks are a concern - GPT-4 runs on a _very_ powerful computer, not the sort of thing that's particularly mobile at present. Although the heaviest computing goes into training the model, its predecessor (GPT-3) is a language model with about 175 billion parameters and occupies roughly 800 gigabytes. I suspect GPT-4 is significantly larger, although some of its improvements over GPT-3 are undoubtedly the result of tuning its existing parameters with additional training material.
If I understand correctly, its operation depends on evaluating billions of very simple linked calculations very quickly - I believe it runs on arrays of what are essentially video cards without the image generation hardware. A high-end video card with 32 gigabytes of memory could hold about 1/25 of the GPT-3 model, so roughly 25 of them together could probably run the whole thing. But a modern high-end video card of that sort tends to want a fair bit of power - in the neighborhood of 200 watts each - so a basic setup to run GPT-3 likely wants at least 5000 watts. And GPT-4 is probably more demanding than that.
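To put rough numbers on that, here's a quick Python back-of-the-envelope sketch using the same figures I assumed above (an 800 gigabyte model, 32 gigabyte cards, roughly 200 watts each) - nothing measured, just arithmetic:

# Back-of-the-envelope estimate using the assumed figures from the post above.
model_size_gb = 800       # rough on-disk size of the GPT-3 model
card_memory_gb = 32       # memory on one high-end video card
card_power_watts = 200    # rough power draw per card under load

cards_needed = -(-model_size_gb // card_memory_gb)   # ceiling division -> 25 cards
total_power_watts = cards_needed * card_power_watts  # 25 * 200 W -> 5000 W

print(cards_needed, "cards, about", total_power_watts, "watts")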
You really want enough memory to hold the whole model at once - a modern SSD can supply data at about 7 gigabytes per second, which means nearly two minutes just to load the model. You can do much better if you can read from more than one SSD at once, but that involves some serious hardware, and even with eight SSD reads in parallel (which is really a lot) you're still looking at about fifteen seconds. That's not bad if you only have to do it once a day or so, but if you have to do it for every interaction, it's going to be as slow as molasses in January, to use an old cliché.
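The same sort of rough arithmetic for the load times, again just assuming an 800 gigabyte model and about 7 gigabytes per second per SSD:

# Estimated time to read the whole model off SSDs, under the assumptions above.
model_size_gb = 800
ssd_gb_per_second = 7      # sequential read speed of one modern NVMe SSD

for parallel_ssds in (1, 8):
    seconds = model_size_gb / (ssd_gb_per_second * parallel_ssds)
    print(parallel_ssds, "SSD(s): about", round(seconds), "seconds to load the model")

# 1 SSD  -> about 114 seconds (nearly two minutes)
# 8 SSDs -> about 14 seconds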
Definitely not something that's going to run on your smartphone or embedded PC in the next year or two.