
One way to crack this problem, according to the authors of a Perspective in this issue, is through a hybrid approach. The latest techniques in deep learning should be accompanied by a hand-in-glove pursuit of conventional physical modelling, to help overcome otherwise intractable problems such as simulating the particle-formation processes that govern cloud convection. The hybrid approach makes the most of well-understood physical principles such as fluid dynamics, incorporating deep learning where physical processes cannot yet be adequately resolved.


Studies of complex climate and ocean systems could gain from a hybrid between artificial intelligence and physical modelling.

Read more

The Elon Musk-funded OpenAI non-profit has created a breakthrough system for writing high-quality text. It can write text, perform basic reading comprehension, machine translation, question answering, and summarization, all without task-specific training.

The system is able to take a few sentences of sample writing and then produce a multi-paragraph article in the style and context of the sample. This capability would let AIs impersonate the writing style of any person from previous writing samples.

GPT-2 is a 1.5-billion-parameter Transformer that achieves state-of-the-art results on 7 out of 8 tested language modeling datasets in a zero-shot setting, yet still underfits (in AI terms, fails to fully fit) its training dataset, WebText. Samples from the model reflect these improvements and contain coherent paragraphs of text. These findings suggest a promising path towards building language processing systems that learn to perform tasks from naturally occurring demonstrations.

Read more

Fake. Dangerous. Scary. Too good. When headlines swim with verdicts like those, you suspect, correctly, that you’re in the land of artificial intelligence, where someone has come up with yet another AI model.

So this is GPT-2, an algorithm that, whether it makes one worry or marvel, “excels at a task known as language modeling,” said The Verge, “which tests a program’s ability to predict the next word in a given sentence.”
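The language-modeling task The Verge describes, predicting the next word in a sentence, can be illustrated with a toy bigram model that simply counts which word tends to follow which. This is a minimal sketch for intuition only (the corpus and function names are invented here); GPT-2 performs the same prediction task with a 1.5-billion-parameter Transformer rather than raw counts.

```python
from collections import Counter, defaultdict

# Tiny toy corpus; GPT-2 trained on WebText, millions of web pages.
corpus = "the next word the next word the next sentence".split()

# Count how often each word follows each other word (bigram counts).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent word seen after `word`, or None."""
    counts = follows.get(word)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("the"))   # "next" follows "the" most often here
print(predict_next("next"))  # "word" follows "next" most often here
```

The gap between this counting scheme and GPT-2 is the point of the headlines: where a bigram model only remembers adjacent pairs, a large Transformer conditions on the entire preceding passage, which is what lets it continue a prompt coherently for multiple paragraphs.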

Depending on how you look at it, you can blame, or congratulate, the team at California-based OpenAI that created GPT-2. Their language-modeling program has written a convincing essay on a topic with which they disagreed.

Read more

Our morning routine could be amended to something like “breakfast, stretching, sit on a medical examiner, shower, then commute.” If we are speaking seriously, we don’t always get to our morning stretches, but a quick medical exam could be on the morning agenda. We would wager that a portion of our readers are poised for that exam as they read this article. The examiner could come in the form of a toilet seat. This IoT throne is the next device you didn’t know you needed, because it can take measurements to detect signs of heart failure every time you take a load off.

Tracking heart failure is not just one test, it is a buttload of tests. Tools exist for each test, but continuous monitoring is difficult. It is unreasonable to expect all at-risk people to sit at a blood pressure machine, inside a ballistocardiograph, with an oximeter on their fingers three times per day. Getting people to browse Hackaday on their phones after lunch is less of a struggle. When the robots overthrow us, this will definitely be held against us.

We are not sure if this particular hardware will be open-source (probably not), but there is a lesson here about putting sensors where people will use them. Despite its low rank on the glamour scale, from a UX point of view it is ingenious. How can we flush out our own projects to make them usable? After all, if you build a badass morning alarm but it tries to kill you, it will need some work; and if you make a gorgeous clock with the numbers all messed up… okay, we dig that particular one for different reasons.

Read more