
Concetta Antico is the world’s most famous tetrachromat, meaning she has four types of color receptors (cone cells) in her eyes; most of us have three. As a result of this mutation, Antico can see around 100 million colors, roughly 100 times more than other people. Antico is an artist, and she says her psychedelic color paintings depict what she perceives. I wonder, though, what her paintings look like through her eyes. From The Guardian:

According to Dr Kimberly Jameson, a University of California scientist who has studied Antico, just having the gene – which around 15% of women have – is not alone sufficient to be a tetrachromat, but it’s a necessary condition. “In Concetta’s case … one thing we believe is that because she’s been painting sort of continuously since the age of seven years old, she has really enlisted this extra potential and used it. This is how genetics works: it gives you the potential to do things and if the environment demands that you do that thing, then the genes kick in.”[…]

While the natural world is a positive stimulant for Antico, many man-made environments, such as a large shopping centre with fluorescent lighting, have the opposite effect. “I feel very uneasy. I actually avoid going into those kinds of buildings unless I absolutely have to,” she says. “I don’t enjoy the barrage, the massive onslaught of bits of unattractive colour. I mean, there’s a difference between looking at a row of stuff in a grocery store and looking at a row of trees. It’s like, it’s ugly, and the lights are garish. It makes me not happy.”

Watch this video ad-free on Nebula: https://nebula.app/videos/polymatter-the-myth-of-chinese-efficiency

Sources: https://pastebin.com/F2B6axnJ

Twitter: https://twitter.com/polymatters
Reddit: https://reddit.com/r/PolyMatter

Music by Graham Haerther (http://www.Haerther.net)
Audio editing by Eric Schneider.
Motion graphics by Vincent de Langen.
Thumbnail by Simon Buckmaster.
Writing & Direction by Evan.

This includes a paid sponsorship which had no part in the writing, editing, or production of the rest of the video.

Music by Epidemic Sound: http://epidemicsound.com

If you download music online, you can get accompanying information embedded into the digital file that might tell you the name of the song, its genre, the featured artists on a given track, the composer, and the producer. Similarly, if you download a digital photo, you can obtain information that may include the time, date, and location at which the picture was taken. That led Mustafa Doga Dogan to wonder whether engineers could do something similar for physical objects. “That way,” he mused, “we could inform ourselves faster and more reliably while walking around in a store or museum or library.”

The idea, at first, was a bit abstract for Dogan, a fourth-year Ph.D. student in the MIT Department of Electrical Engineering and Computer Science. But his thinking solidified in late 2020 when he heard about a new smartphone model with a camera that uses the infrared (IR) range of the electromagnetic spectrum, which the naked eye can’t perceive. IR light, moreover, can pass through certain materials that are opaque to visible light. It occurred to Dogan that this capability, in particular, could be useful.

The concept he has since come up with—while working with colleagues at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) and a research scientist at Facebook—is called InfraredTags. Unlike the standard barcodes affixed to products, which can be removed, detached, or rendered unreadable over time, these tags are unobtrusive (being invisible) and far more durable, since they’re embedded within the interior of objects fabricated on standard 3D printers.
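The article doesn’t spell out the decoding pipeline, but the reading side is easy to picture. Below is a minimal, hypothetical sketch that assumes the phone’s IR camera delivers a grayscale frame in which the embedded tag shows up as faint contrast, and that the tag is a QR-style code OpenCV can decode; none of the names or steps come from the InfraredTags paper itself.

```python
# Hypothetical sketch, not the authors' code: decode a tag that is visible
# only as faint contrast in an infrared camera frame.
import cv2

def decode_ir_tag(frame_gray):
    # Stretch the faint dark/light regions of the embedded tag.
    enhanced = cv2.equalizeHist(frame_gray)
    # A mild blur suppresses sensor noise amplified by equalization.
    enhanced = cv2.GaussianBlur(enhanced, (3, 3), 0)
    payload, points, _ = cv2.QRCodeDetector().detectAndDecode(enhanced)
    return payload or None  # empty string means no tag was found

if __name__ == "__main__":
    # "ir_frame.png" stands in for a captured IR frame.
    frame = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)
    if frame is not None:
        print(decode_ir_tag(frame))
```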

Almost anytime physicists announce that they’ve discovered a new particle, whether it’s the Higgs boson or the recently bagged double-charm tetraquark, what they’ve actually spotted is a small bump rising from an otherwise smooth curve on a plot. Such a bump is the unmistakable signature of “resonance,” one of the most ubiquitous phenomena in nature.

Resonance underlies aspects of the world as diverse as music, nuclear fusion in dying stars, and even the very existence of subatomic particles. Here’s how the same effect manifests in such varied settings, from everyday life down to the smallest scales.

In its simplest form, resonance occurs when an object experiences an oscillating force that’s close to one of its “natural” frequencies, at which it easily oscillates. That objects have natural frequencies “is one of the bedrock properties of both math and the universe,” said Matt Strassler, a particle physicist affiliated with Harvard University who is writing a book about the Higgs boson. A playground swing is one familiar example: “Knock something like that around, and it will always pick out its resonant frequency automatically,” Strassler said. Or flick a wineglass and the rim will vibrate a few hundred times per second, producing a characteristic tone as the vibrations transfer to the surrounding air.
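The “bump” physicists look for is just this amplitude response plotted against driving frequency. Here is a quick sketch using the textbook formula for a driven, damped oscillator (the parameter values are arbitrary, chosen only for illustration):

```python
# Steady-state amplitude of a driven, damped harmonic oscillator:
#   A(w) = F0 / sqrt((w0^2 - w^2)^2 + (gamma * w)^2)
# The peak near the natural frequency w0 is the resonance "bump".
import numpy as np

F0, w0, gamma = 1.0, 2.0, 0.2    # arbitrary drive strength, natural frequency, damping
w = np.linspace(0.1, 4.0, 400)   # sweep of driving frequencies
A = F0 / np.sqrt((w0**2 - w**2)**2 + (gamma * w)**2)

print(f"amplitude peaks at w = {w[A.argmax()]:.2f} (natural frequency w0 = {w0})")
```

With weak damping the peak sits essentially at the natural frequency; stronger damping lowers and broadens the bump, which is how short particle lifetimes show up as wide resonances on those plots.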

👉 For business inquiries: [email protected]
✅ Instagram: https://www.instagram.com/pro_robots

https://www.youtube.com/watch?v=xppMgm2buuM

You’re on the PRO Robots channel, and in this video we bring you the latest high-tech news: robots and technology for the military, Elon Musk’s tower, a new humanoid robot, new drones with unusual designs, and robots for various tasks. See all the most interesting technology news in one episode! Watch to the end and tell us in the comments which news surprised you most.

0:00 In this video.
0:20 Robot for garbage sorting.
0:52 Jet Suit from Gravity Industries.
1:45 SpaceX’s Mechazilla Tower.
2:15 HB1 Robot from HausBots.
2:50 Robot from Nuro.
3:19 Alfred Robot arm.
4:15 HiPeRLab quadcopter.
4:51 ADAM Robot Barista.
5:20 Tocabi humanoid.
5:59 Warehouse robot Spider-Go.
6:43 Combination of drone and underwater robot.
7:17 Robot rover Brawler.
7:48 Stretch by Boston Dynamics.
8:31 Katy Perry shot a music video with Spot.
8:48 Robo C-2.
9:20 Marker military robot.

#prorobots #robots #robot #futuretechnologies #robotics.


Circa 2016


90% of people who try to learn guitar quit in the first year. So when Brian Fan wanted to start strumming out lullabies for his baby daughter, he reinvented the instrument itself. Suddenly, a single finger on a button could replace four on different strings, and anyone could learn to play popular songs in a few minutes.

Meet the Magic Instruments Rhythm guitar. It might look like a toy video game controller from Guitar Hero, but it plays real music through its built-in speaker. And with the accompanying app, you’ll get instructions for how to play tons of popular songs by big-name artists, from The Beatles to Bob Marley.
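The core trick is a remapping rather than new acoustics: each button triggers a full chord voicing that would normally require a multi-finger fretting shape. A toy sketch of the idea (the button-to-chord mapping below is invented for illustration, not Magic Instruments’ actual layout):

```python
# Toy illustration: one button stands in for a whole fretted chord shape.
# The voicings here are made up for the example.
CHORD_BUTTONS = {
    1: ("G major", ["G2", "B2", "D3", "G3"]),
    2: ("C major", ["C3", "E3", "G3", "C4"]),
    3: ("D major", ["D3", "F#3", "A3", "D4"]),
}

def press(button):
    name, notes = CHORD_BUTTONS[button]
    print(f"button {button} -> {name}: {' '.join(notes)}")

for b in (1, 2, 1, 3):  # strum a simple progression
    press(b)
```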

Japanese architect Sou Fujimoto has completed a museum dedicated to music, topped with an undulating roof punctuated by trees, in Budapest’s City Park.

Named House of Music, the 9,000-square-metre museum is dedicated to telling the history of music over the past 2,000 years.

The museum, which was built on the site of the Hungexpo Offices, is surrounded by trees within Budapest’s City Park.

Beyond that, the ECoG technology could be developed for use in the emerging field of brain-computer interfaces, which have a huge range of potential applications – from controlling a computer just by thinking, to streaming music directly to your brain.

By uncovering new knowledge about how the brain works, the device could, for example, be used to interpret hand motions in new ways from brain-wave patterns.
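The article doesn’t describe the decoding method, but a common baseline in brain-computer interface work is to classify movement states from per-electrode band-power features. A minimal sketch on synthetic data, purely to show the shape of the problem (nothing here is the study’s actual method):

```python
# Hypothetical BCI-style decoder: classify hand movement vs. rest from
# ECoG band-power features. Data are synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))            # 200 trials x 8 electrodes of band power
y = (X[:, 0] + X[:, 3] > 0).astype(int)  # fake labels: 0 = rest, 1 = hand movement

clf = LogisticRegression().fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```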

Musicians have been experimenting with artificial intelligence for a few years now. For example, in 2019, an AI trained on Schubert’s music completed his Unfinished Symphony and last October the Beethoven Orchestra in Bonn performed an AI-generated version of Beethoven’s last symphony.

But what are the limits of AI music? Can an AI really be considered creative? And is it possible for an AI to improvise with musicians live on stage?

To find out, researchers from France, the USA and Japan are collaborating on a study to explore the role of AI in creativity, using a combination of machine learning and social science research. The project recently received funding from the European Research Council.

One part of the study involves teaching AI how to improvise and finding out whether it can be used, for example, in live performance with (human) musicians.
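The project’s methods aren’t given here, but even a first-order Markov chain shows the flavor of machine improvisation: learn which note tends to follow which, then sample new phrases. A toy sketch, unrelated to the actual research system:

```python
# Toy improviser: a first-order Markov chain over notes. Purely
# illustrative; not based on the project described above.
import random
from collections import defaultdict

melody = ["C", "D", "E", "C", "E", "G", "E", "D", "C", "D", "E", "E", "D"]

transitions = defaultdict(list)
for a, b in zip(melody, melody[1:]):
    transitions[a].append(b)  # record which notes follow each note

def improvise(start="C", length=16):
    phrase = [start]
    for _ in range(length - 1):
        # Fall back to any learned note if the chain dead-ends.
        choices = transitions.get(phrase[-1]) or melody
        phrase.append(random.choice(choices))
    return phrase

print(" ".join(improvise()))
```

A real improvising system would condition on what the human players are doing in real time, but the learn-then-sample loop has the same basic shape.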
