Crazy Hair Dryers From the Early Years

Before the invention of hair dryers, women and men would often attach hoses to the exhaust ends of vacuum cleaners to blow-dry their hair.

In 1890, French stylist Alexandre-Ferdinand Godefroy devised a contraption combining a seat with a hood connected to a gas stove. A client would sit underneath the hood while a hand crank blew hot air from the stove over her hair.

Godefroy’s hair hood dryer was widely copied and iterated upon, and it became a staple of hair salons. Variants added features such as articulated nozzles and heated coils in lieu of a single helmet.

The first patent for a handheld hair dryer was granted in 1911. Early portable dryers had a few problems, though — they were heavy, produced air barely warmer than room temperature, and had an irritating habit of electrocuting users.

Salon hair dryers remained the best option until the 1970s, when handheld dryers had advanced in aesthetics, power and safety enough to be a viable alternative.

The US military is funding an effort to catch deepfakes and other AI trickery

Think that AI will help put a stop to fake news? The US military isn’t so sure.

The Department of Defense is funding a project that will try to determine whether the increasingly real-looking fake video and audio generated by artificial intelligence might soon be impossible to distinguish from the real thing—even for another AI system.

This summer, under a project funded by the Defense Advanced Research Projects Agency (DARPA), the world’s leading digital forensics experts will gather for an AI fakery contest. They will compete to generate the most convincing AI-generated fake video, imagery, and audio—and they will also try to develop tools that can catch these counterfeits automatically.

The contest will include so-called “deepfakes,” videos in which one person’s face is stitched onto another person’s body. Rather predictably, the technology has already been used to generate a number of counterfeit celebrity porn videos. But the method could also be used to create a clip of a politician saying or doing something outrageous.

DARPA’s technologists are especially concerned about a relatively new AI technique that could make AI fakery almost impossible to spot automatically. Using what are known as generative adversarial networks, or GANs, it is possible to generate stunningly realistic artificial imagery.

“Theoretically, if you gave a GAN all the techniques we know to detect it, it could pass all of those techniques,” says David Gunning, the DARPA program manager in charge of the project. “We don’t know if there’s a limit. It’s unclear.”

A GAN consists of two components. The first, known as the “actor” (more commonly called the “generator”), tries to learn the statistical patterns in a data set, such as a set of images or videos, and then generate convincing synthetic pieces of data. The second, called the “critic” (or “discriminator”), tries to distinguish between real and fake examples. Feedback from the critic enables the actor to produce ever more realistic examples. And because a GAN is, by construction, trained to outwit an AI detector, it is unclear whether any automated system could catch its output.
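The adversarial loop described above can be sketched in a few lines of NumPy. This is a toy illustration, not any real system: both networks are shrunk to single linear units, the “data set” is a one-dimensional Gaussian, and every name and hyperparameter here is invented for the example. Even so, the structure is the genuine GAN recipe: the critic is nudged to score real samples high and fakes low, and the actor is nudged in the opposite direction.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# "Real" data: samples from a Gaussian centered at 4
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Actor (generator): x = w*z + b, driven by noise z ~ N(0, 1)
w, b = 1.0, 0.0
# Critic (discriminator): D(x) = sigmoid(a*x + c), probability x is real
a, c = 0.0, 0.0

lr, n = 0.01, 64
for step in range(5000):
    # --- critic update: push D(real) toward 1, D(fake) toward 0 ---
    xr = real_batch(n)
    z = rng.normal(0.0, 1.0, n)
    xf = w * z + b
    dr, df = sigmoid(a * xr + c), sigmoid(a * xf + c)
    gr, gf = -(1.0 - dr), df          # binary cross-entropy grads w.r.t. logits
    a -= lr * np.mean(gr * xr + gf * xf)
    c -= lr * np.mean(gr + gf)

    # --- actor update: push D(fake) toward 1 (non-saturating loss) ---
    z = rng.normal(0.0, 1.0, n)
    xf = w * z + b
    df = sigmoid(a * xf + c)
    gx = -(1.0 - df) * a              # dLoss/dx for each fake sample
    w -= lr * np.mean(gx * z)
    b -= lr * np.mean(gx)

# After training, the actor's samples have drifted toward the real data
fake = w * rng.normal(0.0, 1.0, 1000) + b
print(float(np.mean(fake)))
```

Run it and the generator’s mean drifts from 0 toward the data mean of 4 as the two updates push against each other. With richer networks for both players, the actor would also be forced to match the variance and higher-order structure of the data, which is exactly what makes the resulting fakes so hard to flag.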

GANs are relatively new, but they have taken the machine-learning scene by storm (see “The GANfather: The man who’s given machines the gift of imagination”). They can already be used to dream up very realistic imaginary celebrities or to convincingly modify images by changing a frown into a smile or turning night into day.

Walter Scheirer, a digital forensics expert at the University of Notre Dame who is involved with the DARPA project, says that the technology has come a surprisingly long way since the initiative was launched a couple of years ago. “We are definitely in an arms race,” he says.

While it has long been possible for a skilled graphics expert to produce convincing-looking forgeries, AI will make the technology far more accessible. “It’s gone from state-sponsored actors and Hollywood to someone on Reddit,” says Hany Farid, a professor at Dartmouth who specializes in digital forensics. “The urgency we feel now is in protecting democracy.”


Humanoid Robot Having A Workout

Boston Dynamics has posted a new video of its advanced humanoid robot, Atlas, performing several tasks with uncannily human-like movements. In the footage, Atlas runs outdoors through a park-like setting, over uneven terrain, and jumps over a log obstructing its path. Atlas and another Boston Dynamics project, the SpotMini robo-dog, inspire both admiration and fear in those who see the robots as a step closer to the bleak futures envisioned in Black Mirror and The Terminator.

Somewhat creepy.

Schools in the UK Are Removing Analog Clocks Because Students Can’t Tell Time

A head-teachers’ union in the UK recently reported that young people have become so accustomed to using digital devices that they have trouble reading the time on analog clocks, prompting some schools to replace them.

According to Malcolm Trobe, deputy general secretary at the Association of School and College Leaders, children and young teens aren’t as good at reading an old-fashioned clock as previous generations were. Because phones, tablets and computers play such a huge role in their lives, they are constantly exposed to time in digital format, and seeing the time displayed in analog format in examination halls can cause children unnecessary stress. For this reason, some schools are removing analog clocks and replacing them with digital ones.


“The current generation aren’t as good at reading the traditional clock face as older generations,” Mr Trobe, a former headmaster, told The Telegraph. “They are used to seeing a digital representation of time on their phone, on their computer. Nearly everything they’ve got is digital so youngsters are just exposed to time being given digitally everywhere.”

Until now, it was assumed that by the time students reach secondary school they are able to read analog clocks, but Mr. Trobe claims this is often no longer the case. His experience is shared by other teachers, who recently took to social media to complain about the issue.

For example, Stephanie Keenan, head of English at Ruislip High School in north-west London, said that her school decided to replace the analog clocks in exam halls with digital ones, after it became clear that some year nine, ten and eleven students had difficulty reading an analog clock face.


Cheryl Quine, a head of department at Cockermouth School and chair of the West Cumbria Network, said that some children at her school couldn’t read analog clocks in exam rooms either.

“It may be a little sad if youngsters coming through aren’t able to tell the time on clock faces,” Malcolm Trobe said. “One hopes that we will be teaching youngsters to read clocks, however we can see the benefit of digital clocks in exam rooms.”

To make matters worse, earlier this year, a senior pediatric doctor warned that young children are finding it increasingly difficult to use analog writing tools like pencils and pens, due to being exposed to phones and tablets all the time.

“To be able to grip a pencil and move it, you need strong control of the fine muscles in your fingers. Children need lots of opportunity to develop those skills,” head pediatric occupational therapist Sally Payne said. “It’s easier to give a child an iPad than to encourage them to do muscle-building play such as building blocks, cutting and sticking, or pulling toys and ropes. Because of this, they’re not developing the underlying foundation skills they need to grip and hold a pencil.”

All we can do is hope that modern technology never fails, otherwise we’re in big trouble.