An anonymous reader writes
"A German computer scientist is taking a fresh look at the 46-year-old Amdahl's law, which first examined how the serial portions of a program limit the speedup achievable through parallel computing. The fresh look considers software development models as a way to overcome those limitations. 'DEEP keeps the code parts of a simulation that can only be parallelized up to a concurrency of p = L on a Cluster Computer equipped with fast general purpose processors. The highly parallelizable parts of the simulation are run on a massively parallel Booster-system with a concurrency of p = H, H >> L. The booster is equipped with many-core Xeon Phi processors and connected by a 3D-torus network of sub-microsecond latency based on EXTOLL technology. The DEEP system software allows to dynamically distribute the tasks to the most appropriate parts of the hardware in order to achieve highest computational efficiency.' Amdahl's law has been revisited many times, most notably by John Gustafson."
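The quoted DEEP approach can be contrasted with plain Amdahl scaling in a few lines of arithmetic. The sketch below is illustrative only: the function names and the 5% low-concurrency fraction are assumptions for the example, not figures from the project.

```python
# Classic Amdahl's law: total speedup is capped by the fraction of the
# program that cannot be parallelized at all.
def amdahl_speedup(serial_fraction, n):
    """Speedup on n processors when serial_fraction of the work runs serially."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n)

# DEEP-style split (hypothetical model): the hard-to-parallelize part still
# runs at concurrency L on the Cluster, while the rest runs at H >> L on the
# Booster -- so nothing is forced down to p = 1.
def deep_speedup(low_fraction, L, H):
    """Speedup when low_fraction of the work scales only to L and the rest to H."""
    return 1.0 / (low_fraction / L + (1.0 - low_fraction) / H)

print(round(amdahl_speedup(0.05, 1024), 1))   # ~19.6 -- near the 1/0.05 = 20x ceiling
print(round(deep_speedup(0.05, 16, 1024), 1)) # ~246.7 -- well past that ceiling
```

Even a modest L removes the hard ceiling that a purely serial fraction imposes, which is the intuition behind running each code part at the concurrency it can actually use.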
Altering Text In eBooks To Track Pirates
"According to Wired, 'German researchers have created a new DRM feature that changes the text and punctuation of an e-book ever so slightly. Called SiDiM, which Google translates to 'secure documents by individual marking,' the changes are unique to each e-book sold. These alterations serve as a digital watermark that can be used to track books that have had any other DRM layers stripped out of them before being shared online. The researchers are hoping the new DRM feature will curb digital piracy by simply making consumers paranoid that they'll be caught if they share an e-book illicitly.' I seem to recall reading about this in Tom Clancy's Patriot Games, when Jack Ryan used this technique to identify someone who was leaking secret documents. It would be so very difficult for someone to write a little program that, when stripping the DRM, randomized a couple of pieces of punctuation to break the hash that the vendor is storing along with the sales record of the individual book."
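The submitter's sarcasm is easy to make concrete: if the vendor stores a hash of the exact text alongside the sales record, any tool that already strips DRM could also perturb a little punctuation and invalidate that hash. The sketch below is a hypothetical illustration of that attack, not a description of SiDiM's actual scheme.

```python
import hashlib
import random

def fingerprint(text):
    """Hypothetical vendor-side hash stored with the sales record."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def scramble_punctuation(text, swaps=2, seed=None):
    """Swap a couple of commas and semicolons, breaking any hash of the text."""
    rng = random.Random(seed)
    chars = list(text)
    positions = [i for i, c in enumerate(chars) if c in ",;"]
    for i in rng.sample(positions, min(swaps, len(positions))):
        chars[i] = ";" if chars[i] == "," else ","
    return "".join(chars)

original = "Call me Ishmael; some years ago, never mind how long,"
modified = scramble_punctuation(original, seed=1)
print(fingerprint(original) != fingerprint(modified))  # True
```

Of course, a watermark designed with this in mind could embed the identifier redundantly across many wording variations, so defeating it is not necessarily this trivial in practice.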
NVIDIA To License Its GPU Tech
An anonymous reader writes
"Today in a blog post, NVIDIA's General Counsel, David Shannon, announced that the company will begin licensing its GPU cores and patent portfolio to device makers. '[I]t's not practical to build silicon or systems to address every part of the expanding market. Adopting a new business approach will allow us to address the universe of devices.' He cites the 'explosion of Android devices' as one of the prime reasons for this decision. 'This opportunity simply didn't exist several years ago because there was really just one computing device – the PC. But the swirling universe of new computing devices provides new opportunities to license our GPU core or visual computing portfolio.' Shannon points out that NVIDIA did something similar with the GPU core used in the PlayStation 3, which was licensed to Sony. But mobile seems to be the big opportunity now: 'We'll start by licensing the GPU core based on the NVIDIA Kepler architecture, the world's most advanced, most efficient GPU. Its DX11, OpenGL 4.3, and GPGPU capabilities, along with vastly superior performance and efficiency, create a new class of licensable GPU cores. Through our efforts designing Tegra into mobile devices, we've gained valuable experience designing for the smallest power envelopes. As a result, Kepler can operate in a half-watt power envelope, making it scalable from smartphones to supercomputers.'"
MySQL Man Pages Silently Relicensed Away From GPL
An anonymous reader writes
"The MariaDB blog is reporting a small change to the license covering the man pages to MySQL. Until recently, the governing license was GPLv2. Now the license reads, 'This software and related documentation are provided under a license agreement containing restrictions on use and disclosure and are protected by intellectual property laws. Except as expressly permitted in your license agreement or allowed by law, you may not use, copy, reproduce, translate, broadcast, modify, license, transmit, distribute, exhibit, perform, publish, or display any part, in any form, or by any means. Reverse engineering, disassembly, or decompilation of this software, unless required by law for interoperability, is prohibited.'"
Verizon Accused of Intentionally Slowing Netflix Video Streaming
"A recent GigaOm report discusses Verizon's 'peering' practices, which involve the exchange of traffic between two bandwidth providers. When peering with bandwidth provider Cogent starts to reach capacity, Verizon reportedly isn't adding any ports to meet the demand, Cogent CEO Dave Schaeffer told GigaOm. 'They are allowing the peer connections to degrade,' Schaeffer said. 'Today some of the ports are at 100 percent capacity.' Why would Verizon intentionally disrupt Netflix video streaming for its customers? One possible reason is that Verizon owns a 50% stake in Redbox, the video rental service that contributed to the demise of Blockbuster (and, more recently, a direct competitor to Netflix in online streaming). If anything threatens the future of Redbox, whose business model requires customers to visit its vending machines to rent and return DVDs, it's Netflix's instant streaming service, which delivers the same content directly to customers' screens."
Oculus Rift Raises Another $16 Million
"It seems that the Oculus Rift virtual-reality headset caught the attention of investors after its showing at E3 this year. Spark Capital and Matrix Partners were able to push $16 million at Oculus VR in the hopes that the product will live up to the hype. The HD unit looks a bit more slick than the ski-goggles-with-a-tablet-glued-to-it prototype, but the device would look even more appealing if the next-gen consoles would commit to supporting it. (We all know how well the PS3's 'wave-stick' did as an afterthought.) That said, major titles like the 9-year-old Half-Life 2 and the 6-year-old Team Fortress 2 are getting full support for the device. Hopefully some developers are looking into support for the Oculus Rift as a launch feature, rather than an addition years after the fact. A bit like the EAX standard from Creative's Sound Blaster line. That worked out well too."
KWin Maintainer: Fanboys and Trolls Are the Cancer Killing Free Software
An anonymous reader writes
"Martin Gräßlin, maintainer of the KWin window manager, writes an informative blog post about his experiences with the less favorable pockets of the Free Software community. Quoting: 'Years ago I had a clear political opinion. I was a civil-rights activist. I appreciated freedom and anything limiting freedom was a problem to me. Freedom of speech was one of the most important rights for me. I thought that democracy has to be able to survive radical or insulting opinions. In a democracy any opinion should have a right even if it's against democracy. I had been a member of the lawsuit against data preservation in Germany. I supported the German Pirate Party during the last election campaign because of a new censorship law. That I became a KDE developer is clearly linked to the fact that it is a free software community. But over the last years my opinion changed. Nowadays I think that not every opinion needs to be tolerated. I find it completely acceptable to censor certain comments and encourage others to censor, too. What was able to change my opinion in such a radical way? After all I still consider civil rights as extremely important. The answer is simple: Fanboys and trolls.'"
Google Files First Amendment Challenge Against FISA Gag Order
The Washington Post reports that Google has filed a motion challenging the gag orders preventing it from disclosing information about the data requests it receives from government agencies. The motion cites the free speech protections of the First Amendment. "FISA court data requests typically are known only to small numbers of a company’s employees. Discussing the requests openly, either within or beyond the walls of an involved company, can violate federal law." From the filing (PDF): "On June 6, 2013, The Guardian newspaper
published a story mischaracterizing the scope and nature of Google's receipt of and compliance with foreign intelligence surveillance requests. ... In light of the intense public interest generated by The Guardian's and Post's erroneous articles, and others that have followed them, Google seeks to increase its transparency with users and the public regarding its receipt of national security requests, if any. ... Google's reputation and business has been harmed by the false or misleading reports in the media, and Google's users are concerned by the allegation. Google must respond to such claims with more than generalities. ... In particular, Google seeks a declaratory judgment that Google has a right under the First Amendment to publish ... two aggregate unclassified numbers: (1) the total number of FISA requests it receives, if any; and (2) the total number of users or accounts encompassed within such requests."
Microsoft To Start Dumping Surface RT To Schools For $199
"In a move that will remind many of Apple in the '80s, Microsoft is going to start dumping Surface RT computers on educational institutions. In an effort to gain mindshare for its struggling Surface RT platform, Microsoft is giving away 10,000 Surface RTs to teachers through the International Society for Technology in Education. It's also preparing to offer $199 Surface RTs to K-12 and higher-education institutions. The strategy of flooding the educational market was quite successful for Apple. Unfortunately for Microsoft, today's computers require management, and the Surface RT presents significant management challenges: it can't be joined to a domain, and management tools are lacking."
With an Eye Toward Disaster, NYC Debuts Solar Charging Stations
Nerval's Lobster writes
"When Hurricane Sandy pummeled New York City last fall, it left a sizable percentage of the metropolis without electricity. Residents had trouble keeping their phones and tablets charged, and often walked across whole neighborhoods to reach zones with power. Come the next disaster, at least a few citizens could communicate a little easier thanks to 25 solar-powered charging stations going up around the city. The stations — known as 'Street Charge' — are the result of a partnership between AT&T, Brooklyn design studio Pensa, and portable solar-power maker Goal Zero (with approval by the city's Parks Department). The first unit will deploy in Brooklyn's Fort Greene Park on June 18, followed in short order by others in Union Square, Central Park, the Rockaways, and other locations. Each station incorporates lithium-ion batteries in addition to solar panels; charging a phone to full capacity could take as long as two hours, but the time necessary for a partial charge is much shorter. But a couple of charging stations also won't help very much if half the city is without power: in order to help mitigate the effects of the next hurricane, New York City mayor Michael Bloomberg has put forward a $20 billion plan for seawalls, levees, and dozens of other improvements. 'Sandy exposed weaknesses in the city's telecommunications infrastructure — including the location of critical facilities in areas that are susceptible to flooding,' reads one section of the plan's accompanying report. The city will harden the system 'by increasing the accountability of telecommunications providers to invest in resiliency and by using new regulatory authority to enable rapid recovery after extreme weather events.'"
2013 U.S. Wireless Network Tests: AT&T Fastest, Verizon Most Reliable
"For the fourth year running, PCMag sent drivers out on U.S. roads to test the nation's Fastest Mobile Networks. Using eight identical Samsung phones, the drivers tested eight separate networks from four major carriers across 30 cities evenly spread across six regions. Using Sensorly's 2013 software, a broad suite of tests was conducted every three minutes: a 'ping' to test network latency; multi-threaded HTTP upload and download tests, including separate 'time to first byte' measures; a 4MB single-threaded file download; a 2MB single-threaded file upload; the download of a 1MB Web page with 70 elements; and 100kbps and 500kbps UDP streams designed to simulate streaming media. Nearly 90,000 data cycles later, the data revealed not only the fastest networks (AT&T) and the most consistent (Verizon), but other interesting points as well. The tests recorded the fastest download speed (66.11 Mbits/sec) in New Orleans and the best average in Austin (27.25 Mbits/sec), both on AT&T's LTE network. The tests also found T-Mobile's HSPA network to have the worst average time-to-first-byte, even when compared with AT&T's HSPA network. And Sprint's LTE network didn't come close to competing with the other LTE networks, to the point that in some cities its average LTE speed was lower than T-Mobile's average HSPA speed."
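One of the suite's metrics, time to first byte, is straightforward to reproduce. The sketch below measures it against a throwaway local HTTP server so it runs without network access; PCMag's actual Sensorly-based harness is, of course, more elaborate, and the handler here is purely illustrative.

```python
import http.server
import threading
import time
import urllib.request

class PongHandler(http.server.BaseHTTPRequestHandler):
    """Minimal handler that answers every GET with a 4-byte body."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "4")
        self.end_headers()
        self.wfile.write(b"pong")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Throwaway local server on an ephemeral port.
server = http.server.HTTPServer(("127.0.0.1", 0), PongHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def time_to_first_byte(url, timeout=5):
    """Seconds from issuing the request until the first response byte arrives."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # blocks until at least one body byte is available
    return time.monotonic() - start

ttfb = time_to_first_byte(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(0 <= ttfb < 5)  # True
```

Over a cellular link, differences like T-Mobile's poor HSPA time-to-first-byte come mostly from radio-link latency rather than the server, but the measurement itself is the same.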
How Ubiquitous Autonomous Cars Could Affect Society (Video)
We talked with Peter Wayner about autonomous cars on June 5. He had a lot to say on this topic, to the point where we seem to be doing a whole series of interviews with him, because autonomous cars might have a lot of unanticipated effects on our lives and our economy. Heck, Peter has enough to say about driverless cars to fill a book, Future Ride, which we hope he finishes editing soon because we (Tim and Robin) want to read it. While that book is brewing, watch the video for some thoughts on how autonomous cars (and delivery vans) might affect us in the near future.
First Particle Comprising Four Quarks Discovered
"Physicists have resurrected a particle that may have existed in the first hot moments after the Big Bang. Arcanely called Zc(3900), it is the first confirmed particle made of four quarks, the building blocks of much of the Universe's matter (abstract one, abstract two). Until now, observed particles made of quarks have contained only three quarks (such as protons and neutrons) or two quarks (such as the pions and kaons found in cosmic rays)."
Jon 'Maddog' Hall On Project Cauã: a Server In Every Highrise
Qedward writes with an excerpt at TechWorld about a new project from Jon "Maddog" Hall, which is about to launch in Brazil:
"The vision of Project Cauã is to promote more efficient computing following the thin client/server model, while creating up to two million privately funded high-tech jobs in Brazil, and another three to four million in the rest of Latin America. Hall explained that São Paulo, Brazil, is the second-largest city in the Western Hemisphere and has about twelve times the population density of New York City. As a result, there are a lot of people living and working in very tall buildings. Project Cauã aims to put a server system in the basement of all of these tall buildings and thin clients throughout the building, so that residents and businesses can run all of their data and applications remotely."
HFT Nothing To Worry About (at Least In Australia)
angry tapir writes
"Although software-driven high-frequency trading has gotten a pretty bad rap (being blamed for the 2010 'Flash Crash,' for example), Australia's chief financial regulator ASIC says that, in Australia at least, it's no cause for concern. After an in-depth study of HFT in Australian markets, ASIC decided to hold off on previously considered regulatory changes (such as implementing a 'pause' for some small trades)."