The robotics firm has revealed its latest creation – a dog-like robot designed to help around the house. Best known for its impressive humanoid ‘Atlas’ and infamous gas-guzzling ‘BigDog’ robots, the company has now come up with something a little more consumer-friendly. Known as ‘SpotMini’, this quadrupedal contraption looks a bit like a small dog and is equipped with a special arm attachment that enables it to do everything from dropping empty cans in the bin to putting dirty glasses into a dishwasher. A recently released YouTube video also shows how the robot is able to climb stairs and recover from a fall – a feature hilariously demonstrated thanks to a conveniently placed banana skin. Whether the robot will ever be available for consumer purchase, however, especially given Boston Dynamics’ recent financial difficulties, remains to be seen.
DynaTAC is a series of cellular telephones manufactured by Motorola, Inc. from 1983 to 1994. The Motorola DynaTAC 8000X commercial portable cellular phone received approval from the U.S. FCC on September 21, 1983. It weighed 1.75 lb and stood 13 in. high. A full charge took roughly 10 hours, and it offered 30 minutes of talk time. It also offered an LED display for dialing or recalling any of 30 stored phone numbers. It was priced at $3,995 in 1984, its commercial release year, equivalent to $9,831 in 2019. DynaTAC was an abbreviation of “Dynamic Adaptive Total Area Coverage.”
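The inflation conversion above is a straightforward consumer-price-index ratio. A minimal sketch, assuming approximate CPI-U annual averages (the index values below are illustrative, not taken from the article):

```python
# Adjust the DynaTAC's 1984 list price into 2019 dollars using the
# ratio of consumer price indexes. The CPI-U annual averages here are
# approximate figures used only for illustration.
CPI_1984 = 103.9
CPI_2019 = 255.7

price_1984 = 3995
price_2019 = price_1984 * CPI_2019 / CPI_1984
print(f"${price_2019:,.0f}")  # lands close to the article's $9,831 figure
```

The same ratio method works for any pair of years, as long as both prices are compared against the same index series.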
Several models followed, starting in 1985 with the 8000s, and continuing with periodic updates of increasing frequency until 1993’s Classic II. The DynaTAC was displaced in most roles by the much smaller Motorola MicroTAC after the latter’s introduction in 1989, and by the time of the Motorola StarTAC’s release in 1996, it was obsolete.
Martin Cooper of Motorola made the first publicized handheld mobile phone call on a prototype DynaTAC model on April 3, 1973.
The first cellular phone was the culmination of two strands of work: efforts at Bell Labs, which first proposed the idea of a cellular system in 1947 and petitioned the Federal Communications Commission (FCC) for channels through the 1950s and 1960s, and research conducted at Motorola. In 1960, electrical engineer John F. Mitchell became Motorola’s chief engineer for its mobile communication products. Mitchell oversaw the development and marketing of the first pager to use transistors.
Motorola had long produced mobile telephones for cars that were large and heavy and consumed too much power to allow their use without the automobile’s engine running. Mitchell’s team, which included Martin Cooper, developed portable cellular telephony, and Mitchell was among the Motorola employees granted a patent for this work in 1973; the first call on the prototype was completed, reportedly, to a wrong number.
While Motorola was developing the cellular phone itself, during 1968–1983, Bell Labs worked on the system called AMPS, while others designed cell phones for that and other cellular systems. Martin Cooper, a former general manager for the systems division at Motorola, led the team that produced the DynaTAC 8000X, the first commercially available cellular phone small enough to be easily carried, and made the first analog cellular mobile phone call on a prototype in 1973.
On July 8, 2022, Canadian telecom provider Rogers Communications experienced a major service outage; it affected Rogers’ cable internet and cellular networks, including subsidiary brands Rogers Wireless, Fido, and Chatr. It also impacted internet service providers with wholesale access to the Rogers network, such as TekSavvy, as well as various other information systems nationwide that rely on the Rogers network, including Interac and some federal government services. Multiple international web monitoring companies observed the outage.
Rogers began slowly restoring service that evening, but CEO Tony Staffieri stated there was no estimated time for when services would become fully operational again. The next day, Rogers stated that it had restored service to the “vast majority” of its customers; however, not all service had been restored across the country.
A report by Cloudflare suggested that the outage was due to internal, rather than external, causes. It identified spikes in BGP updates, as well as withdrawals of IP prefixes, noting that Rogers had stopped advertising its presence, leaving other networks unable to find routes to the Rogers network. As of the day after the outage, the cause remained unknown; Public Safety Canada stated that it was not a cyberattack. The outage was later attributed to a maintenance upgrade that caused routers to malfunction.
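The mechanism Cloudflare observed, prefix withdrawals leaving a network unreachable, can be sketched with a toy model of a peer’s routing table. The prefix and AS number below are reserved documentation values, not Rogers’ actual routes:

```python
# Toy model of how a BGP prefix withdrawal makes a network unreachable.
# 203.0.113.0/24 (TEST-NET-3) and AS64500 are reserved documentation
# values, not Rogers' real prefix or ASN.

routes = {}  # a peer's view of the world: prefix -> origin AS

def announce(prefix, origin_as):
    """The peer installs a route when the origin advertises the prefix."""
    routes[prefix] = origin_as

def withdraw(prefix):
    """A BGP withdrawal deletes the route; traffic then has nowhere to go."""
    routes.pop(prefix, None)

def reachable(prefix):
    return prefix in routes

announce("203.0.113.0/24", 64500)
print(reachable("203.0.113.0/24"))   # True: route installed

withdraw("203.0.113.0/24")           # what peers saw happen at scale
print(reachable("203.0.113.0/24"))   # False: the network "disappears"
```

Real BGP involves path selection, policy, and propagation delays, but the core effect is this simple: no advertisement, no route, no traffic.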
Creating a machine that can perform the delicate work of picking an apple is tricky – and farm workers say it could be a benefit.
Robots can do a lot. They build cars in factories. They sort goods in Amazon warehouses. Robotic dogs can, allegedly and a little creepily, make us safer by patrolling our streets. But there are some things robots still cannot do – things that sound quite basic in comparison. Like picking an apple from a tree.
“It’s a simple thing” for humans, says robotics researcher Joe Davidson. “You and I, we could close our eyes, reach into the tree. We could feel around, touch it, and say ‘hey, that’s an apple and the stem’s up here’. Pull, twist. We could do all that without even looking.”
Creating a robotic implement that can simply pick an apple and drop it into a bin without damaging it is a multimillion-dollar effort that has been decades in the making. Teams around the world have tried various approaches. Some have developed vacuum systems to suck fruit off trees. Davidson and his colleagues turned to the human hand for inspiration. They began their efforts by observing professional fruit pickers, and are now working to replicate their skilled movements with robotic fingers.
Their work could help to transform agriculture, turning fruit-picking – a backbreaking, time-consuming human task – into one that’s speedy and easier on farm workers.
These efforts have gained impetus recently as researchers point to the worsening conditions for farm workers amid the climate crisis, including extreme heat and wildfire smoke, and also a shortage of workers in the wake of the pandemic. The technology could lead to better working conditions and worker safety. But that outcome depends on how robots are deployed in fields, farm workers’ organizations say.
Trump was barred from the platform in January 2021 in the final days of his presidency amid unrest following the Jan. 6 attack on the U.S. Capitol.
Tech billionaire Elon Musk said Tuesday that he would allow former President Donald Trump back on Twitter after Musk completes his plan to buy the company, giving the most concrete example yet of how his vision of social media would play out in reality.
Musk said at an event sponsored by The Financial Times that it was “morally bad” and “foolish in the extreme” for Twitter to “permanently suspend” Trump in January 2021 after Trump’s supporters violently stormed the U.S. Capitol, according to a video of the event posted online.
“I do think that it was not correct to ban Donald Trump,” Musk, the CEO of Tesla, said at the newspaper’s Future of the Car event by remote video.
“I think that was a mistake, because it alienated a large part of the country and did not ultimately result in Donald Trump not having a voice,” he said, citing Trump’s newly launched tech platform, Truth Social.
“I would reverse the permanent ban,” Musk said.
They’re tall. They’re totally absurd. And they’re everywhere.
Over the past few decades, as cellphone networks have grown, thousands of antenna towers designed to look vaguely like trees have been built across the United States. Although these designs are intended to camouflage a tower’s aesthetic impact on the landscape, they typically do the opposite: most look like what an alien from a treeless planet might create if told to imagine a tree.
A “pine” in Colorado. (Brian Brainerd/the Denver Post via Getty Images)
In the 1980s, soon after cellphone companies started building antennas in the United States, they sought to hide them as well, often in response to aesthetic complaints from local residents.
Initially, most concealed antennas were simply hidden on church steeples or water towers, but in 1992, a company called Larson Camouflage — which had previously made fake habitats for Disney World and museums — built a “pine” tower in Denver. The world was changed forever.
Soon afterward, companies in South Carolina and South Africa began building similar “trees.” In the US, the Telecommunications Act of 1996 restricted municipalities’ ability to block tower construction, so as demand for cell service spread, towers would inevitably be built in historic districts and other areas where locals might object.
A “tree” in Cambridge, Massachusetts. (Darren McCollester/Getty Images)
Still, municipalities have often tried to block construction, leading companies to offer “trees” instead of towers as a compromise. Some localities even require new towers be camouflaged as part of their zoning requirements.
There’s no good data on how many of these “trees” now exist, but in 2013, Bernard Mergen estimated there were between 1,000 and 2,000 nationwide. The company Stealth Concealment says it builds about 350 new “trees” per year. They’re most often built in suburbs, where residents have the time and urge to war with companies over new towers, and there’s enough incentive for carriers to invest in “trees.”
Why these “trees” look so ridiculous
There are good reasons why these towers seldom look like real trees.
One is height. Towers are built to hold antennas higher than surrounding structures to ensure good reception, so they have to be taller than what’s nearby. This is why you often see surreally tall “pines” or “palms” towering over normal trees.
Another is cost. These “trees” are normal cellphone towers, which are then sent to companies like Larson or Stealth Concealment for plastic, fiberglass, or acrylic “bark,” “branches,” and “needles” to be added. This process is customized and expensive: it can add $100,000 or so to the baseline $150,000 cost of a tower.
As Ryan McCarthy of Larson told Bernard Mergen, “A pine tree that has 200 branches will be more appealing than one of the same height that has 100. However, the customer will not only incur the cost of 100 extra branches, but the extra wind load from the branches will also require that the pole be designed more stoutly.”
This is also why you so seldom see towers designed as deciduous trees, even in areas where they’re much more common than pines — their branching structure makes them more complex and more expensive to build. Pines, palms, and cacti are much easier to approximate in plastic and fiberglass.
In terms of blending in, the most successful towers are probably “saguaros,” which can plausibly be built in deserts where there are no trees that they have to tower over — and don’t have expensive branches or needles that need to be attached.
In the 1960s there was a science fiction TV show called Voyage to the Bottom of the Sea. The show centered on the crew of a huge nuclear-powered submarine named the Seaview. One of the show’s more interesting features was a mini flying sub housed in the nose of the Seaview. This little sub could bolt away from the Seaview, propel itself through the water to the surface, and take to the skies – then land back on the water, submerge, and dock with the Seaview again.
Americans love their high-technology gadgets, and the military is often at the forefront when it comes to developing cutting-edge systems. Believe it or not, the U.S. military is looking into a real Flying Sub!
Irwin Allen, the creator of Voyage to the Bottom of the Sea would be very proud indeed.
GUILLEMOTS and gannets do it. Cormorants and kingfishers do it. Even the tiny insect-eating dipper does it. And if a plan by the Pentagon’s Defense Advanced Research Projects Agency (DARPA) succeeds, a remarkable airplane may one day do it too: plunge beneath the waves to stalk its prey, before re-emerging to fly home.
The DARPA plan calls for a stealthy aircraft that can fly low over the sea until it nears its target, which could be an enemy ship, or a coastal site such as a port. It will then alight on the water and transform itself into a submarine that will cruise under water to within striking distance, all without alerting defences.
That, at least, is the plan. The agency is known for taking on brain-twistingly difficult challenges. So what about DARPA’s dipper? Is it a ridiculous dream? “A few years ago I would have said that this is a silly idea,” says Graham Hawkes, an engineer and submarine designer based in San Francisco. “But I don’t think so any more.”
DARPA, which has a $3 billion annual budget, has begun to study proposed designs. In the next year or so it could begin allocating funding to developers. Though the agency itself is unwilling to comment, Hawkes and others working on rival designs have revealed to New Scientist how they would solve the key problems involved in building a plane that can travel underwater – or, to put it another way, a flying submarine.
The challenges are huge, not least because planes and submarines are normally poles apart. Aircraft must be as light as possible to minimise the engine power they need to get airborne. Submarines are heavyweights with massive hulls strong enough to resist crushing forces from the surrounding water. Aircraft use lift from their wings to stay aloft, while submarines operate like underwater balloons, adjusting their buoyancy to sink or rise. So how can engineers balance the conflicting demands? Could a craft be designed to dive into the sea like a gannet? And how will it be propelled – is a jet engine the best solution, both above and below the waves?
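The “underwater balloon” point comes down to Archimedes’ principle: a submerged craft rises or sinks depending on whether the weight of the water it displaces exceeds its own weight. A minimal sketch with made-up figures (the hull volume and masses below are illustrative, not from any real or proposed design):

```python
# Net vertical force on a fully submerged hull (Archimedes' principle).
# Positive -> the craft floats upward; negative -> it dives.
# All figures are illustrative, not taken from any actual design.
RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density
G = 9.81               # m/s^2

def net_buoyant_force(hull_volume_m3, mass_kg):
    """Buoyant force minus weight, in newtons."""
    return (RHO_SEAWATER * hull_volume_m3 - mass_kg) * G

# Flooding ballast tanks pushes mass past the displaced-water weight:
print(net_buoyant_force(10.0, 10_000))  # positive: rises
print(net_buoyant_force(10.0, 10_500))  # negative: dives
```

This is why the two roles conflict: a hull light enough to fly wants to bob to the surface, so a flying submarine must either take on ballast or use dynamic forces (thrust, inverted “wings”) to stay down.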
The Utah Data Center (UDC), also known as the Intelligence Community Comprehensive National Cybersecurity Initiative Data Center, is a data storage facility for the United States Intelligence Community that is designed to store data estimated to be on the order of exabytes or larger. Its purpose is to support the Comprehensive National Cybersecurity Initiative (CNCI), though its precise mission is classified. The National Security Agency (NSA) leads operations at the facility as the executive agent for the Director of National Intelligence. It is located at Camp Williams near Bluffdale, Utah, between Utah Lake and Great Salt Lake and was completed in May 2014 at a cost of $1.5 billion.
Critics believe that the data center has the capability to process “all forms of communication, including the complete contents of private emails, cell phone calls, and Internet searches, as well as all types of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital ‘pocket litter’.” In response to claims that the data center would be used to illegally monitor email of U.S. citizens, in April 2013 an NSA spokesperson said, “Many unfounded allegations have been made about the planned activities of the Utah Data Center, … one of the biggest misconceptions about NSA is that we are unlawfully listening in on, or reading emails of, U.S. citizens. This is simply not the case.”
In April 2009, officials at the United States Department of Justice acknowledged that the NSA had engaged in large-scale overcollection of domestic communications in excess of the United States Foreign Intelligence Surveillance Court’s authority, but claimed that the acts were unintentional and had since been rectified.
In August 2012, The New York Times published short documentaries by independent filmmakers titled The Program, based on interviews with former NSA technical director and whistleblower William Binney. The project had been designed for foreign signals intelligence (SIGINT) collection, but Binney alleged that after the September 11 terrorist attacks, controls that limited unintentional collection of data pertaining to U.S. citizens were removed, prompting concerns by him and others that the actions were illegal and unconstitutional. Binney alleged that the Bluffdale facility was designed to store a broad range of domestic communications for data mining without warrants.
Documents leaked to the media in June 2013 described PRISM, a national security computer and network surveillance program operated by the NSA, as enabling in-depth surveillance on live Internet communications and stored information. Reports linked the data center to the NSA’s controversial expansion of activities, which store extremely large amounts of data. Privacy and civil liberties advocates raised concerns about the unique capabilities that such a facility would give to intelligence agencies. “They park stuff in storage in the hopes that they will eventually have time to get to it,” said James Lewis, a cyberexpert at the Center for Strategic and International Studies, “or that they’ll find something that they need to go back and look for in the masses of data.” But, he added, “most of it sits and is never looked at by anyone.”
The UDC was expected to store Internet data, as well as telephone records from the controversial NSA telephone call database, MAINWAY, when it opened in 2013.
In light of the controversy over the NSA’s involvement in the practice of mass surveillance in the United States, and prompted by the 2013 mass surveillance disclosures by ex-NSA contractor Edward Snowden, the Utah Data Center was hailed by The Wall Street Journal as a “symbol of the spy agency’s surveillance prowess”.
Binney has said that the facility was built to store recordings and other content of communications, not only for metadata.
According to an interview with Snowden, the project was initially known as the Massive Data Repository within NSA, but was renamed to Mission Data Repository due to the former sounding too “creepy”.
An article by Forbes estimates the storage capacity as between 3 and 12 exabytes in the near term, based on analysis of unclassified blueprints, but mentions Moore’s Law, meaning that advances in technology could be expected to increase the capacity by orders of magnitude in the coming years.
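To put the Forbes estimate in perspective, a quick back-of-the-envelope conversion (the 10 TB drive size is an arbitrary yardstick, not a figure from the article):

```python
# Express the estimated 3-12 exabyte capacity in commodity-drive terms.
# SI units: 1 EB = 10^18 bytes, 1 TB = 10^12 bytes. The 10 TB drive
# size is a convenient yardstick chosen for illustration.
EB = 10**18
TB = 10**12
drive = 10 * TB

low, high = 3 * EB, 12 * EB
print(low // drive, high // drive)  # 300000 1200000 ten-terabyte drives
```

Even the low end of the range corresponds to hundreds of thousands of large drives, which is consistent with the Moore’s-Law caveat: as drive densities grow, the same floor space can hold orders of magnitude more data.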
Toward the end of construction, the project was plagued by electrical problems in the form of “massive power surges” that damaged equipment, delaying its opening by a year.
The finished structure is characterized as a Tier III data center with over a million square feet of space, built at a cost of over $1.5 billion. Of that million square feet, 100,000 square feet are dedicated to the data center floor; the other 900,000 square feet are used for technical support and administrative space.
Inside the internet: Google allows first ever look at the eight vast data centres that power the online world
- Data centres range from vast warehouses in Iowa to a converted paper mill in Finland
- Buildings are so large Google even provides bicycles for engineers to get around them
- Street View tour of North Carolina facility reveals Stormtrooper standing guard
Google has given a rare glimpse inside the vast data centres around the globe that power its services.
They reveal an intricate maze of computers that process Internet search requests, show YouTube video clips and distribute email for millions of people.
With hundreds of thousands of servers, colourful cables and even bicycles so engineers can get around quickly, they range from a converted paper mill in Finland to custom-made server farms in Iowa.
One of Google’s server farms in Council Bluffs, Iowa, which provides over 115,000 square feet of space for servers running services like Search and YouTube
‘Very few people have stepped inside Google’s data centers, and for good reason: our first priority is the privacy and security of your data, and we go to great lengths to protect it, keeping our sites under close guard,’ the firm said.
‘While we’ve shared many of our designs and best practices, and we’ve been publishing our efficiency data since 2008, only a small set of employees have access to the server floor itself.
‘Today, for the first time, you can see inside our data centers and pay them a virtual visit.
‘On Where the Internet lives, our new site featuring beautiful photographs by Connie Zhou, you’ll get a never-before-seen look at the technology, the people and the places that keep Google running.’
The site features photos from inside some of the eight data centers that Google Inc. already has running in the U.S., Finland and Belgium.
Google is also building data centers in Hong Kong, Taiwan, Singapore and Chile.
Virtual tours of a North Carolina data center also will be available through Google’s ‘Street View’ service, which is usually used to view photos of neighborhoods around the world.
The photographic access to Google’s data centers coincides with the publication of a Wired magazine article about how the company builds and operates them.
The article is written by Steven Levy, a journalist who won Google’s trust while writing ‘In The Plex,’ a book published last year about the company’s philosophy and evolution.
Google colour codes its servers depending on their location, while piping in the buildings is coded depending on what it carries – with cool water in blue tubes and warm in red
Google’s Douglas County data centre in Georgia is so large the firm provides Google branded bicycles for staff to get around on
The data centers represent Google’s nerve center, although none are located near the company’s headquarters in Mountain View, Calif.
As Google blossomed from its roots in a Silicon Valley garage, company co-founders Larry Page and Sergey Brin worked with other engineers to develop a system to connect low-cost computer servers in a way that would help them realize their ambition to provide a digital roadmap to all of the world’s information.
Initially, Google just wanted enough computing power to index all the websites on the Internet and deliver quick responses to search requests. As Google’s tentacles extended into other markets, the company had to keep adding more computers to store videos, photos, email and information about their users’ preferences.
A street view tour published by Google also reveals a hidden surprise – A Stormtrooper standing guard over a server in Google’s North Carolina server farm
The insights that Google gathers about the more than 1 billion people that use its services has made the company a frequent target of privacy complaints around the world.
Google studies Internet search requests and Web surfing habits in an effort to gain a better understanding of what people like, and uses that understanding to show ads for products and services to the people most likely to buy them. Advertising accounts for virtually all of Google’s revenue, which totaled nearly $23 billion through the first half of this year.
Even as it allows anyone with a Web browser to peer into its data centers, Google intends to closely guard physical access to its buildings. The company also remains cagey about how many computers are in its data centers, saying only that they house hundreds of thousands of machines to run Google’s services.
Google’s need for so many computers has turned the company into a major electricity user, although management says it’s constantly looking for ways to reduce power consumption to protect the environment and lower its expenses.
In Oklahoma, hundreds of fans funnel hot air from the server racks into a cooling unit to be recirculated. The green lights are server status LEDs reflected from the front of the servers.
The Iowa campus network room, where routers and switches allow data centers to talk to each other. The fiber cables run along the yellow cable trays near the ceiling.
Even the water pipes reflect Google’s brand: These colorful pipes are responsible for carrying water in and out of an Oregon data center. The blue pipes supply cold water and the red pipes return the warm water back to be cooled.
In Hamina, Finland, Google chose to renovate an old paper mill to take advantage of the building’s infrastructure as well as its proximity to the Gulf of Finland’s cooling waters.
Google’s server farm in Douglas County, Georgia
Denise Harwood, a Google engineer, diagnoses an overheated CPU. For more than a decade, Google has built some of the world’s most efficient servers.
Each server rack has four switches, each connected by a different coloured cable. Colours are kept the same throughout the data centres so staff know which cable to replace in case of failure.