He’s hiring a ghost writer because they are very cheap.
When a person dies, they stop needing earthly rewards. And, because a lot of great authors and writers have died, there are a lot of candidate ghost writers, like Martin Amis, Truman Capote and Barbara Cartland. A good spiritualist can summon the right auteur from beyond this mortal coil for any compositional need you have!
Did you reply before even reading the summary:
“What we found is that the average cognitive deficit was equivalent to 10 IQ points, based on what would be expected for their age, et cetera,” says Maxime Taquet at the University of Oxford.
We are discussing a decline over just 4 years, and one that is already adjusted for age.
MicroWakeWord is a project built on the ESPHome framework.
ESPHome is a project for building, deploying and managing microcontroller firmware (e.g. on ESP32 devices). Because MicroWakeWord builds on ESPHome, you can easily deploy it to your preferred device.
ESPHome is deeply integrated with Home Assistant, so the integration is essentially OOTB - but you have to flash ESPHome firmware onto an ESP device, which will probably involve soldering electronics. There are some dev kits available that come pre-built with everything you need, though (like this one - no endorsement).
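For the curious: wake word support is configured in the device's ESPHome YAML. A minimal, illustrative sketch - the component and key names follow my reading of the ESPHome micro_wake_word docs, and the board is an assumption, so verify against your ESPHome version before flashing:

```yaml
# Abbreviated example, not a complete flashable config. A real
# device also needs a microphone / i2s_audio block wired to the
# correct pins for your particular board.
esp32:
  board: esp32-s3-devkitc-1   # assumed board; substitute your own
  framework:
    type: esp-idf

micro_wake_word:
  models:
    - model: okay_nabu        # a pre-trained microWakeWord model
```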
So a bunch of people fail on their first attempt and pass on the second (or third) try. Of all people who eventually pass, 70-80% took the test twice or more.
Corollary: in any given exam sitting, 20-50% of all takers are there for the second (or later) time. So first-timers make up considerably less than 100% of the room, and I'm guessing their failure rate is greater than 50%.
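These percentages can be related with a toy model. A quick simulation sketch - the assumptions (every candidate retries until passing, with a fixed per-attempt pass rate) are mine, not OC's:

```python
import random

random.seed(0)
PASS_RATE = 0.5        # assumed per-attempt pass probability
CANDIDATES = 100_000   # simulated candidates, all retry until they pass

attempts = []
for _ in range(CANDIDATES):
    n = 1
    while random.random() > PASS_RATE:
        n += 1
    attempts.append(n)

total_sittings = sum(attempts)
retake_sittings = total_sittings - CANDIDATES        # every sitting after the first
retake_passers = sum(1 for n in attempts if n > 1)   # passers who needed 2+ tries

print(f"passers who took it 2+ times: {retake_passers / CANDIDATES:.0%}")
print(f"sittings that are retakes:    {retake_sittings / total_sittings:.0%}")
```

Under these assumptions both fractions come out to 1 minus the per-attempt pass rate; the fact that the quoted ranges (70-80% vs 20-50%) differ suggests that in reality not every failer actually retries.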
All junior devs should read OC's comment and really think about this.
The issue is whether is_number() is performing a semantic language test or checking whether the text input can be converted by the program to a number type.
The former case - the semantic language test - is useful for chat-based interactions, analysis of text (including ancient text - I love the cuneiform, btw) and similar. In this mode, some applications don't even need to be able to convert the text into, e.g., binary (a 'gazillion' of something quantifies it, but only vaguely).
The latter case (validating input) is useful where the input is controlled and users are supposed to enter numbers using a limited part of a standard keyboard. Clay tablets and triangular sticks are strictly excluded from this interface.
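The split can be made concrete in code. A sketch - the function names and the toy lexicon are mine, purely illustrative:

```python
# Two different contracts that often hide behind one name like is_number().

NUMBER_WORDS = {"one", "two", "dozen", "umpteen", "gazillion"}  # toy lexicon

def is_parseable_number(text: str) -> bool:
    """Input-validation test: can the program convert it to a number type?"""
    try:
        float(text)
        return True
    except ValueError:
        return False

def is_numeric_expression(text: str) -> bool:
    """Semantic test: does the text talk about a quantity at all?
    May accept vague quantifiers that can never be converted."""
    return text.strip().lower() in NUMBER_WORDS or is_parseable_number(text)

print(is_numeric_expression("gazillion"))  # True  - quantifies, but vaguely
print(is_parseable_number("gazillion"))    # False - nothing to convert
print(is_parseable_number("3.14"))         # True
```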
Another example might be is_address(). Which of these are addresses? '10 Downing Street, London', '193.168.1.1', 'Gettysburg', 'Sir/Madam'.
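The same ambiguity, made concrete for a hypothetical is_address() - the function names and heuristic here are illustrative; only the ipaddress stdlib module is real:

```python
import ipaddress

def is_network_address(text: str) -> bool:
    """Validation test: an IP address the program can actually use."""
    try:
        ipaddress.ip_address(text)
        return True
    except ValueError:
        return False

def looks_like_street_address(text: str) -> bool:
    """Crude semantic test: reads like a street address (toy heuristic)."""
    parts = text.split()
    return len(parts) >= 2 and parts[0].isdigit()

print(is_network_address("193.168.1.1"))                       # True
print(is_network_address("10 Downing Street, London"))         # False
print(looks_like_street_address("10 Downing Street, London"))  # True
```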
To me this highlights that code is a lot less reusable between different projects/apps than it at first appears.
It is terribly sad - they must live in a world of hurt.
However, so many of these people actively try to hurt LGBTQ+ and trans people by inciting hate and changing laws to harm anyone who isn't straight. In particular, they have been preaching that being gay or trans equates to being a child molester. This is horrific and needs to stop. Exposing the hypocrisy is essential to reducing the harm they are inflicting on real people right now.
Typically you need about 1GB of graphics RAM for each billion parameters (i.e. one byte per parameter). This is a 405B-parameter model - roughly 405GB. Ouch.
Edit: you can try quantizing it. This reduces the amount of memory required per parameter to 4 bits, 2 bits or even 1 bit. As you reduce the size, the performance of the model can suffer. So in the extreme case you might be able to run this in under 64GB of graphics RAM.
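The arithmetic behind those numbers (weights only - KV cache, activations and runtime overhead come on top):

```python
PARAMS = 405e9  # parameter count for the model discussed above

for bits in (16, 8, 4, 2, 1):
    gb = PARAMS * bits / 8 / 1e9   # bytes per parameter = bits / 8
    print(f"{bits:>2}-bit weights: ~{gb:,.0f} GB")
```

At 1-bit quantization the weights alone come to about 51GB, which is where the "under 64GB" figure comes from.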