The Era of Fake News & Video Begins
"The digital manipulation of video may make the current era of “fake news” seem quaint"
-- Edmon de Haro
In a dank corner of the internet, it is possible to find actresses from Game of Thrones or Harry Potter engaged in all manner of sex acts. Or at least to the world the carnal figures look like those actresses, and the faces in the videos are indeed their own. Everything south of the neck, however, belongs to different women. An artificial intelligence has almost seamlessly stitched the familiar visages into pornographic scenes, one face swapped for another. The genre is one of the cruelest, most invasive forms of identity theft invented in the internet era. At the core of the cruelty is the acuity of the technology: A casual observer can’t easily detect the hoax.
This development, which has been the subject of much hand-wringing in the tech press, is the work of a programmer who goes by the nom de hack “deepfakes.” And it is merely a beta version of a much more ambitious project. One of deepfakes’s compatriots told Vice’s Motherboard site in January that he intends to democratize this work. He wants to refine the process, further automating it, which would allow anyone to transpose the disembodied head of a crush or an ex or a co-worker into an extant pornographic clip with just a few simple steps. No technical knowledge would be required. And because academic and commercial labs are developing even more-sophisticated tools for non-pornographic purposes—algorithms that map facial expressions and mimic voices with precision—the sordid fakes will soon acquire even greater verisimilitude.
The internet has always contained the seeds of postmodern hell. Mass manipulation, from clickbait to Russian bots to the addictive trickery that governs Facebook’s News Feed, is the currency of the medium. It has always been a place where identity is terrifyingly slippery, where anonymity breeds coarseness and confusion, where crooks can filch the very contours of selfhood. In this respect, the rise of deepfakes is the culmination of the internet’s history to date—and probably only a low-grade version of what’s to come.
Vladimir Nabokov once wrote that reality is one of the few words that means nothing without quotation marks. He was sardonically making a basic point about relative perceptions: When you and I look at the same object, how do you really know that we see the same thing? Still, institutions (media, government, academia) have helped people coalesce around a consensus—rooted in a faith in reason and empiricism—about how to describe the world, albeit a fragile consensus that has been unraveling in recent years. Social media have helped bring on a new era, enabling individuated encounters with the news that confirm biases and sieve out contravening facts. The current president has further hastened the arrival of a world beyond truth, providing the imprimatur of the highest office to falsehood and conspiracy.
But soon this may seem an age of innocence. We’ll shortly live in a world where our eyes routinely deceive us. Put differently, we’re not so far from the collapse of reality.
We cling to reality today, crave it even. We still very much live in Abraham Zapruder’s world. That is, we venerate the sort of raw footage exemplified by the 8 mm home movie of John F. Kennedy’s assassination that the Dallas clothier captured by happenstance. Unedited video has acquired an outsize authority in our culture. That’s because the public has developed a blinding, irrational cynicism toward reporting and other material that the media have handled and processed—an overreaction to a century of advertising, propaganda, and hyperbolic TV news. The essayist David Shields calls our voraciousness for the unvarnished “reality hunger.”
Scandalous behavior stirs mass outrage most reliably when it is “caught on tape.” Such video has played a decisive role in shaping the past two U.S. presidential elections. In 2012, a bartender at a Florida fund-raiser for Mitt Romney surreptitiously hit record on his camera while the candidate denounced “47 percent” of Americans—Obama supporters all—as enfeebled dependents of the federal government. A strong case can be made that this furtively captured clip doomed his chance of becoming president. The remarks almost certainly would not have registered with such force if they’d merely been scribbled down and written up by a reporter. The video—with its indirect camera angle and clink of ambient cutlery and waiters passing by with folded napkins—was far more potent. All of its trappings testified to its unassailable origins.
Donald Trump, improbably, recovered from the Access Hollywood tape, in which he bragged about sexually assaulting women, but that tape aroused the public’s passions and conscience like nothing else in the 2016 presidential race. Video has likewise provided the proximate trigger for many other recent social conflagrations. It took extended surveillance footage of the NFL running back Ray Rice dragging his unconscious wife from a hotel elevator to elicit a meaningful response to domestic violence from the league, despite a long history of abuse by players. Then there was the 2016 killing of Philando Castile by a Minnesota police officer, streamed to Facebook by his girlfriend. All the reports in the world, no matter the overwhelming statistics and shattering anecdotes, had failed to provoke outrage over police brutality. But the terrifying broadcast of his animalistic demise in his Oldsmobile rumbled the public and led politicians, and even a few hard-line conservative commentators, to finally acknowledge the sort of abuse they had long neglected.
Fabricated videos will create new suspicions about everything we watch. Politicians will exploit those doubts.
That all takes us to the nub of the problem. It’s natural to trust one’s own senses, to believe what one sees—a hardwired tendency that the coming age of manipulated video will exploit. Consider recent flash points in what the University of Michigan’s Aviv Ovadya calls the “infopocalypse”—and imagine just how much worse they would have been with manipulated video. Take Pizzagate, and then add concocted footage of John Podesta leering at a child, or worse. Falsehoods will suddenly acquire a whole new, explosive emotional intensity.
But the problem isn’t just the proliferation of falsehoods. Fabricated videos will create new and understandable suspicions about everything we watch. Politicians and publicists will exploit those doubts. When captured in a moment of wrongdoing, a culprit will simply declare the visual evidence a malicious concoction. The president, reportedly, has already pioneered this tactic: Even though he initially conceded the authenticity of the Access Hollywood video, he now privately casts doubt on whether the voice on the tape is his own.
In other words, manipulated video will ultimately destroy faith in our strongest remaining tether to the idea of common reality. As Ian Goodfellow, a scientist at Google, told MIT Technology Review, “It’s been a little bit of a fluke, historically, that we’re able to rely on videos as evidence that something really happened.”
The collapse of reality isn’t an unintended consequence of artificial intelligence. It’s long been an objective—or at least a dalliance—of some of technology’s most storied architects. In many ways, Silicon Valley’s narrative begins in the early 1960s with the International Foundation for Advanced Study, not far from the legendary engineering labs clumped around Stanford. The foundation specialized in experiments with LSD. Some of the techies working in the neighborhood couldn’t resist taking a mind-bending trip themselves, undoubtedly in the name of science. These developers wanted to create machines that could transform consciousness in much the same way that drugs did. Computers would also rip a hole in reality, leading humanity away from the quotidian, gray-flannel banality of Leave It to Beaver America and toward a far groovier, more holistic state of mind. Steve Jobs described LSD as “one of the two or three most important” experiences of his life.
Fake-but-realistic video clips are not the end point of the flight from reality that technologists would have us take. The apotheosis of this vision is virtual reality. VR’s fundamental purpose is to create a comprehensive illusion of being in another place. With its goggles and gloves, it sets out to trick our senses and subvert our perceptions. Video games began the process of transporting players into an alternate world, injecting them into another narrative. But while games can be quite addictive, they aren’t yet fully immersive. VR has the potential to more completely transport—we will see what our avatars see and feel what they feel. Several decades ago, after giving the nascent technology a try, the psychedelic pamphleteer Timothy Leary reportedly called it “the new LSD.”
Life could be more interesting in virtual realities as the technology emerges from its infancy; the possibilities for creation might be extended and enhanced in wondrous ways. But if the hype around VR eventually pans out, then, like the personal computer or social media, it will grow into a massive industry, intent on addicting consumers for the sake of its own profit, and possibly dominated by just one or two exceptionally powerful companies. (Facebook’s investment in VR, with its purchase of the start-up Oculus, is hardly reassuring.)
The ability to manipulate consumers will grow because VR definitionally creates confusion about what is real. Designers of VR have described some consumers as having such strong emotional responses to a terrifying experience that they rip off those chunky goggles to escape. Studies have already shown how VR can be used to influence the behavior of users after they return to the physical world, making them either more or less inclined to altruistic behaviors.
Researchers in Germany who have attempted to codify ethics for VR have warned that its “comprehensive character” introduces “opportunities for new and especially powerful forms of both mental and behavioral manipulation, especially when commercial, political, religious, or governmental interests are behind the creation and maintenance of the virtual worlds.” As the VR pioneer Jaron Lanier writes in his recently published memoir, “Never has a medium been so potent for beauty and so vulnerable to creepiness. Virtual reality will test us. It will amplify our character more than other media ever have.”
Perhaps society will find ways to cope with these changes. Maybe we’ll learn the skepticism required to navigate them. Thus far, however, human beings have displayed a near-infinite susceptibility to getting duped and conned—falling easily into worlds congenial to their own beliefs or self-image, regardless of how eccentric or flat-out wrong those beliefs may be. Governments have been slow to respond to the social challenges that new technologies create, and might rather avoid this one. The question of deciding what constitutes reality isn’t just epistemological; it is political and would involve declaring certain deeply held beliefs specious.
Few individuals will have the time or perhaps the capacity to sort elaborate fabulation from truth. Our best hope may be outsourcing the problem, restoring cultural authority to trusted validators with training and knowledge: newspapers, universities. Perhaps big technology companies will understand this crisis and assume this role, too. Since they control the most-important access points to news and information, they could most easily squash manipulated videos, for instance. But to play this role, they would have to accept certain responsibilities that they have so far largely resisted.
In 2016, as Russia used Facebook to influence the American presidential election, Elon Musk confessed his understanding of human life. He talked about a theory, derived from an Oxford philosopher, that is fashionable in his milieu. The idea holds that we’re actually living in a computer simulation, as if we’re already characters in a science-fiction movie or a video game. He told a conference, “The odds that we’re in ‘base reality’ is one in billions.” If the leaders of the industry that presides over our information and hopes to shape our future can’t even concede the existence of reality, then we have little hope of salvation.
Facebook believes the data of up to 87 million people was improperly shared with the political consultancy Cambridge Analytica - many more than previously disclosed. The BBC has been told that about 1.1 million of them are UK-based. The overall figure had been previously quoted as being 50 million by the whistleblower Christopher Wylie.
Facebook chief Mark Zuckerberg said "clearly we should have done more, and we will going forward".
Zuckerberg: 'I'm still the man to lead Facebook'
During a press conference, he said that he had previously assumed that if Facebook gave people tools, it was largely their responsibility to decide how to use them.
But he added that it was "wrong in retrospect" to have had such a limited view. "Today, given what we know... I think we understand that we need to take a broader view of our responsibility," he said. "That we're not just building tools, but that we need to take full responsibility for the outcomes of how people use those tools as well."
Mr Zuckerberg also announced an internal audit had uncovered a fresh problem. Malicious actors had been abusing a feature that let users search for one another by typing in email addresses or phone numbers into Facebook's search box. As a result, many people's public profile information had been "scraped" and matched to the contact details, which had been obtained from elsewhere.
Facebook has now blocked the facility. "It is reasonable to expect that if you had that [default] setting turned on, that in the last several years someone has probably accessed your public information in this way," Mr Zuckerberg said.
New numbers
The estimates of how many people's data had been exposed were revealed in a blog by the tech firm's chief technology officer, Mike Schroepfer.
The BBC has also learned that Facebook now estimates that about 305,000 people had installed the This Is Your Digital Life quiz that had made the data-harvesting possible. The previously suggested figure had been 270,000. About 97% of the installations occurred within the US. However, just over 16 million of the total number of users affected are thought to be from other countries.
A spokeswoman for the UK's Information Commissioner's Office told the BBC that it was continuing to assess and consider the evidence before deciding what steps to take.
What is the controversy about?
Facebook has faced intense criticism after it emerged that it had known for years that Cambridge Analytica had collected data from millions of its users, but had relied on the London-based firm to self-certify that it had deleted the information.
Cambridge Analytica said it had bought the information from the creator of the This Is Your Digital Life app without knowing that it had been obtained improperly. The firm says it deleted all the data as soon as it was made aware of the circumstances. But Channel 4 News has since reported that at least some of the data in question is still in circulation despite Cambridge Analytica insisting it had destroyed the material.
During Mr Zuckerberg's press conference, Cambridge Analytica tweeted it had only obtained data for 30 million individuals - not 87 million - from the app's creator, and again insisted it had deleted all records.
"Cambridge Analytica licensed data from GSR for 30 million individuals, not 87 million. We did not receive more than 30 million records from research company GSR."
— Cambridge Analytica (@CamAnalytica) April 4, 2018
The latest revelations came several hours after the US House Commerce Committee announced that Facebook's founder, Mark Zuckerberg, would testify before it on 11 April. Facebook's share price has dropped sharply in the weeks since the allegations emerged.
Wide-ranging changes
In his Wednesday blog post, Mr Schroepfer detailed new steps being taken by Facebook in the wake of the scandal.
They include:
* a decision to stop third-party apps seeing who is on the guest lists of Events pages and the contents of messages posted on them
* a commitment to only hold call and text history logs collected by the Android versions of Messenger and Facebook Lite for a year. In addition, Facebook said the logs would no longer include the time of the calls
* a link will appear at the top of users' News Feeds next week, prompting them to review the third-party apps they use on Facebook and what information is shared as a consequence
An alert will remind users that they can remove any apps they no longer want to have access to their data. Facebook has also published proposed new versions of its terms of service and data use policy. The documents are longer than the existing editions in order to make the language clearer and more descriptive.
Tinder users affected
Another change the company announced involved limiting the type of information that can be accessed by third-party applications. Immediately after the changes were announced, however, users of the popular dating app Tinder were hit by login errors, leaving them unable to use the service.
"A technical issue is preventing users from logging into Tinder. We apologize for the inconvenience and are working to have everyone swiping again soon."
— Tinder (@Tinder) April 4, 2018
Tinder relies on Facebook to manage its logins. Users reported that they had been signed out of the app and were unable to log in again. Instead, the app repeatedly asks for more permissions to access a user's Facebook profile information. Many were quick to link the outage to the changes announced by Facebook.
"Y'all I just checked on my account and this is real. Facebook just broke Tinder. This is about to be America's loneliest Wednesday night in several years t.co/5KHe763wGY"
— Casey Newton (@CaseyNewton) April 4, 2018
Fake News
The Cambridge Analytica scandal follows earlier controversies about "fake news" and evidence that Russia tried to influence US voters via Facebook.
* Mr Zuckerberg has declined to answer questions from British MPs.
* Mr Zuckerberg now regrets saying it was a "pretty crazy idea" that fake news on Facebook could have helped Donald Trump become president
When asked about this by the BBC, he said he had decided that his chief technology officer and chief product officer should answer questions from countries other than the US. He added, however, that he had made a mistake in 2016 by dismissing the notion that fake news had influenced the US Presidential election.
"People will analyse the actual impact of this for a long time to come," he added. "But what I think is clear at this point is that it was too flippant and I should never have referred to it as crazy." Related Topics