Lenguador
That reminds me of a joke.
A museum guide is talking to a group about the dinosaur fossils on exhibit.
“This one,” he says, “is 6 million and 2 years old.”
“Wow,” says a patron. “How do you know the age so accurately?”
“Well,” says the guide, “it was 6 million years old when I started here 2 years ago.”
From Wikipedia: this is only a 1-sigma result compared to the theoretical prediction based on lattice calculations. It would have been 5.1-sigma if the calculation method had not been improved.
Many calculations in the Standard Model are mathematically intractable with current methods, so improving the approximate solutions is non-trivial, and it’s not surprising that we’ve found improvements.
This seems like more of an achievement for the Barbie brand than for the individual director.
Apparently Inflection AI have bought 22,000 H100 GPUs. The H100 has approximately 4x the transformer compute of the A100. GPT-4 is rumored to be 10x larger than GPT-3, and GPT-3 takes approximately 34 days to train on 1024 A100 GPUs.
So with 22,000 × 4 / 1024 = 85.9375x the compute of the GPT-3 training run, they could easily train a model 10x the size of GPT-4 in 1-2 months: 10x GPT-4 is roughly 100x GPT-3, and 100 × 34 / 86 ≈ 40 days. Getting to 100x GPT-4’s size would also be feasible, but likely they’re banking on the claimed 3x speedup from FlashAttention-2, which would result in about 6 months of training.
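A rough Python sketch of this arithmetic, under the naive assumptions implicit above (required compute scales linearly with model size, ideal hardware utilisation; all figures are rumoured or approximate):

    # Figures from the comment above; all rumoured or approximate.
    H100_COUNT = 22_000     # Inflection AI's reported cluster size
    H100_VS_A100 = 4        # rough transformer throughput ratio
    GPT3_GPUS = 1024        # A100s in the GPT-3 baseline
    GPT3_DAYS = 34          # approximate GPT-3 training time

    # Cluster compute relative to the GPT-3 training run: ~85.9x
    relative_compute = H100_COUNT * H100_VS_A100 / GPT3_GPUS

    def train_days(size_vs_gpt3, speedup=1.0):
        # Naive model: required compute scales linearly with model size
        return GPT3_DAYS * size_vs_gpt3 / (relative_compute * speedup)

    print(train_days(100))      # 10x GPT-4 (~100x GPT-3): ~40 days
    print(train_days(1000))     # 100x GPT-4: ~396 days
    print(train_days(1000, 3))  # with 3x FlashAttention-2: ~132 days

At ideal utilisation the 100x GPT-4 case with FlashAttention-2 comes out nearer 4-5 months; real-world efficiency losses would stretch it toward 6.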
It’s crazy that these scales and timelines seem plausible.
This is an essay about the Barbie brand and its relationship to feminism and capitalism, throughout history and into the modern day. The Barbie movie is discussed but it’s not the primary focus.
NGC 1277 is unusual among galaxies because it has had little interaction with its neighbours.
I wonder if interactions between galaxies somehow convert regular matter to dark matter.
Claude 2 would have a much better chance at this because of the longer context window.
There are plenty of alternate, theorised, and critiqued endings for Game of Thrones online, though, so current chatbots should have a better shot at doing a good job than other writers who haven’t finished their series in over a decade.