Corporate employers, alarmed by the recent spread of minimum wage hikes, have been taking measures such as these:
the McCafe Coffee Kiosk (which is basically a self-serve coffee machine for the cheap price of $2.99)...
But the war of robot vs. unskilled-worker-demanding-a-pay-raise, which the former had been expected to win by complete annihilation of the latter, hit a snag when actual robot waiters were employed, pardon, deployed in China, leading to unintended consequences.
Out of three Guangzhou restaurants that used robots to serve customers, two have closed and the third has fired its robot waiters, the Workers' Daily has reported (We couldn't find a Chinese Robot Daily yet).
According to the Chinese media, customers flocked to the Heweilai Restaurant chain in the southern Chinese city when it introduced robots last year, but the chain has stopped using the machines for a number of reasons.
A staff member said the robots couldn't effectively handle soup dishes, often malfunctioned, and had to follow a fixed route that sometimes resulted in collisions. A customer also said the robots were unable to do tasks such as topping up water or placing a dish on the table.
Another restaurant in Guangzhou's Baiyun District said robots were used only because of a high turnover of waiters and waitresses.
"The robots weren't able to carry soup or other food steadily, and they would frequently break down. The boss has decided never to use them again," said one employee.
Zhang Yun, a vice president at the Guangdong University of Technology, said robots will be widely used within the manufacturing industry in the future, as many tasks are repetitive, but further development is needed before robots are able to work effectively in the service sector.
For now, it appears, China's minimum-wage workers (and it has a few hundred million of those) will not be phased out just yet.
In the US, however, it's a different matter. Remember Boston Dynamics' Atlas? The 5'9" tall, 180 lb robot is perfectly suited to replace any number of fast food employees. All it needs is the McDonald's retention letter, and the next stage in the war of (minimum-paid) man against robot may commence.
« Last Edit: Apr 8th, 2016, 4:31pm by Sys_Config »
How to make someone look like they are saying and feeling something when they are not. The Art of Deception is now a full-fledged science.
Once upon a time, a photo of something was enough to believe it was real. Sure, you'd have to deal with the occasional Bigfoot hoax, but for the most part, those who had the time or talent to create believable fakes were in the minority.
Then came the age of Photoshop. Edits and fakes are prolific enough that “FAKE!” has become the default; a dubious photo is presumed fake unless proven otherwise.
We’re not quite to that point with video. Fake videos exist in droves, obviously, but editing a video to be something it’s not introduces a bevy of challenges not found in the editing of a single still frame, and generally requires considerably more time and talent to do right. People will still yell “FAKE!” but it’ll be a quieter yell. As your Facebook feed probably proves, moderately well-faked videos have a much easier time finding believers.
That might not be the case for long.
The video up top shows a work-in-progress system called Face2Face (research paper here) being built by researchers at Stanford, the Max Planck Institute and the University of Erlangen-Nuremberg.
The short version: take a YouTube video of someone speaking, say, George W. Bush. Use a standard RGB webcam to capture a video of someone else emoting and saying something entirely different. Throw both videos into the Face2Face system and, bam, you've now got a relatively believable video of George W. Bush's face, now almost entirely synthesized, doing whatever the actor in the second video wanted the target's face to do. It even tries to work out what the interior of the mouth should look like as the target is speaking. It's not pixel-perfect yet; even in the relatively low-res clips we're shown, there's an uncanny valley effect of something being not quite right. But hot damn is it impressive (and, well, more than a little spooky) even at this early stage.
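To get an intuition for what the system is doing under the hood: the paper's approach models a face with a parametric model, where a fixed set of identity coefficients describes *whose* face it is and a separate set of expression coefficients describes what that face is *doing*, and then reapplies the actor's expression onto the target's identity. The sketch below is a deliberately simplified toy, not the paper's code; every name and number in it is illustrative.

```python
# Toy illustration of the expression-transfer idea behind Face2Face:
# a face = fixed identity coefficients + per-frame expression coefficients.
# The actor's expression *delta* from neutral is applied to the target's
# neutral expression, while the target's identity stays untouched.

def transfer_expression(target_identity, target_neutral_expr,
                        actor_neutral_expr, actor_current_expr):
    # How far has the actor's face moved away from its own neutral pose?
    delta = [c - n for c, n in zip(actor_current_expr, actor_neutral_expr)]
    # Apply that movement to the target's neutral pose.
    new_expr = [n + d for n, d in zip(target_neutral_expr, delta)]
    # A renderer would now synthesize the target face from
    # (target_identity, new_expr); here we just return the parameters.
    return target_identity, new_expr

# Hypothetical coefficients: slot 0 = mouth opening, slot 1 = brow raise.
identity = [0.8, 0.1, 0.3]      # fixed per-person shape coefficients
target_neutral = [0.0, 0.0]
actor_neutral = [0.1, 0.0]
actor_emoting = [0.9, 0.4]      # actor opens mouth and raises brows

ident, expr = transfer_expression(identity, target_neutral,
                                  actor_neutral, actor_emoting)
print(expr)  # the target face now carries the actor's expression delta
```

The real system does this densely over thousands of model coefficients per frame, tracks both faces live, and re-renders the result with the target video's original lighting and pose, which is what makes the output so convincing.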
Why spooky? Technology like this will serve to make video less inherently believable. The video’s use of politicians as the editing target is pretty self-aware. In that regard, political hoaxes will hit a lot harder when it’s a video instead of a ‘shopped picture being forwarded around.
Don’t freak out too hard, though. Hoaxes have existed in every medium throughout history. This tech isn’t widely available beyond its researchers just yet; the uncanny valley challenges in stepping from “something’s-kinda-off” to pixel-perfect infallibility aren’t small ones. Just remember that, just like photos before it, being on video doesn’t mean it’s real, and that gets a lil’ bit truer every day.
Edit: I don't wanna hear that crap.. I wanna know if I can buy one now!
Edit: Sys did not just say that.. I did.. Zetar.. Can you tell the difference?
« Last Edit: Apr 9th, 2016, 3:45pm by Sys_Config »