I guess any procedure involving the brain feels like a different category of risk to most people. You must face that anxiety every day. I think there are two types of surgical practice that really strike at the core of people’s anxiety. One is brain surgery, where you are operating on something that people see as themselves, their sense of identity, their mind. The other one is, I think, paediatric surgery, where the operation is on the thing most precious to you – your children. I think both create a dynamic where you need to work harder to create trust with your patients.

When it comes to innovation that might link a person’s mind directly with a machine, it seems as much an ethical as a medical question. Is that how you see it? Ethicists are critical in what we do. A working interface would be a real turning point in human evolution. I don’t say that with bombast or hyperbole. And just like with artificial intelligence, we need to take the greatest care in how we think about it. Whether it happens in five years or 50 years, it will happen. I wrote these two science-fiction novels to try to walk people through some of the things that could happen; for example, if others got unauthorised access to these implants, or if corporations got involved. We need to be thinking about these things now, rather than after the fact.

Was one of the motivations in writing your books to work out these things for yourself? Did you feel the same at the beginning of the process as at the end? I had certain ideas in mind when I started the books, but there was an evolution. I came to think less about that individual interface and more about the effect this technology might have on society. We need to think hard about how to ensure these advances do not increase social division.

Read more

There is an enduring fear in the music industry that artificial intelligence will replace the artists we love, and end creativity as we know it.

As hyperbolic as this claim may sound, it’s grounded in concrete evidence. Last December, an AI-composed song populated several New Music Friday playlists on Spotify, with full support from Spotify execs. An entire startup ecosystem is emerging around services that give artists automated songwriting recommendations, or enable the average internet user to generate customized instrumental tracks at the click of a button.

But AI’s long-term impact on music creation isn’t so cut and dried. In fact, if we as an industry are already thinking so reductively and pessimistically about AI from the beginning, we’re sealing our own fates as slaves to the algorithm. Instead, if we take the long view on how technological innovation has made it progressively easier for artists to realize their creative visions, we can see AI’s genuine potential as a powerful tool and partner, rather than as a threat.

Read more

SAS® supports the creation of deep neural network models. Examples of these models include convolutional neural networks, recurrent neural networks, feedforward neural networks, and autoencoder neural networks. Let’s examine in more detail how SAS creates deep learning models using SAS® Visual Data Mining and Machine Learning.

Deep learning models with SAS Cloud Analytic Services

SAS Visual Data Mining and Machine Learning takes advantage of SAS Cloud Analytic Services (CAS) to perform what are referred to as CAS actions. You use CAS actions to load data, transform data, compute statistics, perform analytics and create output. Each action is configured by specifying a set of input parameters. Running a CAS action processes the action’s parameters and data, which creates an action result. CAS actions are grouped into CAS action sets.
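The configure-parameters → run-action → action-result cycle described above can be sketched with SAS’s open-source SWAT Python client for CAS. The host, port, credentials, and the tiny layer stack below are illustrative placeholders, not a recommended setup; the point is that every step, from loading the `deepLearn` action set to adding a layer, is just another parameterized CAS action call.

```python
def build_small_cnn(host, port, username, password):
    """Connect to CAS and drive deepLearn actions: each call below is one
    CAS action, configured entirely through its input parameters."""
    import swat  # SAS's open-source Python client for CAS

    conn = swat.CAS(host, port, username, password)

    # Actions are grouped into action sets; load the deep learning set first.
    conn.loadactionset('deepLearn')

    # buildModel action: parameters name the model table and its type.
    conn.deepLearn.buildModel(model={'name': 'smallCNN', 'replace': True},
                              type='CNN')

    # addLayer actions: one parameterized action call per layer.
    conn.deepLearn.addLayer(model='smallCNN', name='data',
                            layer={'type': 'input', 'nChannels': 1,
                                   'width': 28, 'height': 28})
    conn.deepLearn.addLayer(model='smallCNN', name='conv1',
                            layer={'type': 'convolution', 'nFilters': 8,
                                   'width': 3, 'height': 3, 'act': 'relu'},
                            srcLayers=['data'])
    conn.deepLearn.addLayer(model='smallCNN', name='out',
                            layer={'type': 'output', 'act': 'softmax'},
                            srcLayers=['conv1'])

    # Running an action returns a CASResults object -- the "action result".
    info = conn.deepLearn.modelInfo(modelTable='smallCNN')
    conn.close()
    return info
```

Against a running CAS server, `build_small_cnn('cas-host.example.com', 5570, 'user', 'pass')` would return the model summary as a CASResults object; without one, the function simply illustrates the action-set/action/parameter pattern.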

Read more

About a year ago, Apple made the bold proclamation that it was zeroing in on a future where iPhones and MacBooks were created wholly of recycled materials. It was, and still is, an ambitious thought. In a technologically-charged world, many forget that nearly 100 percent of e-waste is recyclable. Apple didn’t.

Named “Daisy,” Apple’s new robot builds on its previous iteration, Liam, which Apple used to disassemble unneeded iPhones in an attempt to scrap or reuse the materials. Like her predecessor, Daisy can successfully salvage the bulk of the materials needed to create brand new iPhones. All told, the robot is capable of extracting parts from nine types of iPhone, and for every 100,000 devices it manages to recover 1,900 kg (4,188 pounds) of aluminum, 770 kg of cobalt, 710 kg of copper, and 11 kg of rare earth elements — which also happen to be some of the hardest to source and most environmentally unfriendly materials required to build the devices.

In its latest environmental progress report, Apple noted:

Read more

Researchers in artificial intelligence stand to make a ton of money. But this week, we actually know just how much some A.I. experts are being paid — and it’s a lot, even at a nonprofit.

OpenAI, a nonprofit research lab, paid its lead A.I. expert, Ilya Sutskever, more than $1.9 million in 2016, according to a recent public tax filing. Another researcher, Ian Goodfellow, made more than $800,000 that year, even though he was only hired in March, the New York Times reported.

As the publication points out, the figures are eye-opening and offer a bit of insight into how much A.I. researchers are being paid across the globe. Normally, this kind of data isn’t readily accessible. But because OpenAI is a nonprofit organization, it’s required by law to make these figures public.

Read more