Monday, 23 January 2012
Here are a few interesting reads from the recent past. Below each link is a snippet from the article to provide some context on what it is about.
-
Surgical robots: The kindness of strangers | The Economist
Robot-assisted surgery today is dominated by the da Vinci Surgical System, a device that scales down a surgeon’s hand movements in order to allow him to perform operations using tiny incisions. That leads to less tissue damage, and thus a quicker recovery for patients. Thousands of da Vincis have been made, and they are reckoned to be used in over 200,000 operations a year around the world, most commonly hysterectomies and prostate removals.
None of that is true of the Raven. This device—originally developed for the American army by Dr Hannaford and Jacob Rosen of the University of California, Santa Cruz, as a prototype for robotic surgery on the battlefield—is compact, light and cheap (relatively speaking) at around $250,000. More importantly for academics, it is also the first surgical robot to use open-source software. Its Linux-based operating system allows anyone to modify and improve the original code, creating a way for researchers to experiment and collaborate.
-
BBC News - Can a company live forever?
The past few years have seen previously unthinkable corporate behemoths - from financial firms such as Lehman Brothers to iconic car manufacturers such as Saab - felled by economic turmoil or by unforgiving customers and tough rivals.
And do not put away the black garb yet - the pace of corporate funerals is set to pick up.
The average lifespan of a company listed in the S&P 500 index of leading US companies has decreased by more than 50 years in the last century, from 67 years in the 1920s to just 15 years today, according to Professor Richard Foster from Yale University.
Today's rate of change "is at a faster pace than ever", he says.
Professor Foster estimates that by 2020, more than three-quarters of the S&P 500 will be companies that we have not heard of yet.
-
Health Care Is Next Frontier for Big Data - WSJ.com
Big Data—the ability to collect, process and interpret massive amounts of information—is one of today's most important technological drivers. While companies see it as a way of detecting weak market signals, one of the biggest potential areas of application for society is health care.
Historically, health care has been delivered by one doctor looking at one patient with only the information the doctor has at that time. But how much better if the doctor had access to information about thousands, or even tens of thousands, of people?
Acquiring medical data has, historically, been problematic. It is wrapped in layers of regulations and stringent safeguards and is expensive to collect.
It is also not representative of the general population, for the problem with health care is that only ill people use it. If you want to know what is going on in the general population, ill people aren't terribly useful; they are, after all, ill.
-
Understanding SOPA: A Simple Q&A for Understanding the Online Piracy Debate - WSJ.com
It will undermine free speech and due process, says one side. It will protect America's creative class from thieves, says the other. But what's really in the Stop Online Piracy Act? A guide:
-
Managing the business risks of open innovation - McKinsey Quarterly - Strategy - Innovation
Several years ago, something interesting happened in the infrastructure software sector: IBM and a number of other companies pledged some of their own patents to the public to create IP-free zones in parts of the value chain. They did so when a 2004 report showed that Linux, the open-source operating system that had emerged as a viable, low-cost alternative to established operating systems, such as Microsoft Windows and Unix, was inadvertently infringing on more than 250 patents. By voluntarily pledging not to enforce hundreds of IBM’s own patents so long as users of the IP were pursuing only open-source purposes, the company led the creation of an alliance of patent holders dependent on (and willing to defend) open-source software against lawsuits. One result: IBM substantially increased the share of its new products based on Linux.
-
Technological change: The last Kodak moment? | The Economist
Strange to recall, Kodak was the Google of its day. Founded in 1880, it was known for its pioneering technology and innovative marketing. “You press the button, we do the rest,” was its slogan in 1888.
By 1976 Kodak accounted for 90% of film and 85% of camera sales in America. Until the 1990s it was regularly rated one of the world’s five most valuable brands.
Then came digital photography to replace film, and smartphones to replace cameras. Kodak’s revenues peaked at nearly $16 billion in 1996 and its profits at $2.5 billion in 1999. The consensus forecast by analysts is that its revenues in 2011 were $6.2 billion. It recently reported a third-quarter loss of $222m, the ninth quarterly loss in three years. In 1988, Kodak employed over 145,000 workers worldwide; at the last count, barely one-tenth as many. Its share price has fallen by nearly 90% in the past year.
-
Financial terrorism: The war on terabytes | The Economist
The financial industry has done such a good job of bringing itself to its knees over the past four years that it is easy to overlook the threats it faces from outside. High among them is electronic attack. In 2010 Symantec, a cybersecurity firm, estimated that three-quarters of all “phishing” attacks, in which people are deceived into surrendering private details such as account numbers, are aimed at the finance sector. Bob Greifeld, the boss of NASDAQ, has described his bourse as being under “literally constant attack”.
Many of these assaults are carried out by hackers bent on mischief. Some are the work of organised criminal groups in pursuit of loot. But plenty of people fret that some attackers are aiming to cause more serious damage.
-
Defending Privacy at the U.S. Border: A Guide for Travelers Carrying Digital Devices | Electronic Frontier Foundation
For doctors, lawyers, and many business professionals, these border searches can compromise the privacy of sensitive professional information, including trade secrets, attorney-client and doctor-patient communications, research and business strategies, some of which a traveler has legal and contractual obligations to protect. For the rest of us, searches that can reach our personal correspondence, health information, and financial records are reasonably viewed as an affront to privacy and dignity and inconsistent with the values of a free society.
-
Data mining without prejudice
The graph of the relationship between two variables in a dataset could take any shape: For a company’s hourly employees, the graph of hours worked to wages would approximate a straight line. A graph of flu incidence versus time, however, might undulate up and down, representing familiar seasonal outbreaks, whereas adoption of a new technology versus time might follow a convex curve, starting off slowly and ramping up as the technology proves itself. An algorithm for mining large datasets needs to be able to recognize any such relationship; that’s what Reshef means by generality.
Equitability is a little more subtle. If you actually tried to graph workers’ hours against wages, you probably wouldn’t get a perfectly straight line. But linear relationships, undulating relationships or curved relationships with the same amount of noise should all score equally well. That’s equitability.
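To see why equitability is tricky, here is a small illustration (a hedged sketch using plain Pearson correlation, not Reshef's own statistic): three relationships with the same amount of noise score very differently under a conventional measure, which is exactly the behaviour an equitable statistic is meant to avoid.

# A minimal sketch (not Reshef's implementation) illustrating equitability:
# Pearson correlation rewards the linear relationship and heavily penalizes
# the undulating one, even though both carry the same amount of noise.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 2000)
noise = rng.normal(0, 0.1, 2000)

linear     = 2 * x + noise                      # straight-line relationship
undulating = np.sin(4 * np.pi * x) + noise      # seasonal-outbreak-style wave
convex     = np.exp(3 * x) / np.exp(3) + noise  # slow start, then ramps up

for name, y in [("linear", linear), ("undulating", undulating), ("convex", convex)]:
    r = np.corrcoef(x, y)[0, 1]
    print(f"{name:10s}  Pearson r^2 = {r**2:.2f}")

# Typical result: the linear case scores near 1.0 while the undulating case
# scores near 0.0 despite identical noise, the kind of inequity a general,
# equitable statistic is designed to avoid.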
-
High Scalability - How Twitter Stores 250 Million Tweets a Day Using MySQL
Jeremy Cole, a DBA Team Lead/Database Architect at Twitter, gave a really good talk at the O'Reilly MySQL conference: Big and Small Data at @Twitter, where the topic was thinking of Twitter from the data perspective.
One of the interesting stories he told was of the transition from Twitter's old way of storing tweets using temporal sharding, to a more distributed approach using a new tweet store called T-bird, which is built on top of Gizzard, which is built using MySQL.
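To get a rough feel for the difference between the two approaches, here is a toy sketch (hypothetical shard names and counts, not Twitter's actual T-bird or Gizzard code): temporal sharding routes every new tweet to the shard for the current time window, so the newest shard soaks up all the write load, while hashing the tweet id spreads the load evenly across shards.

# Toy illustration of the two sharding strategies described above.
# Shard counts and routing rules are hypothetical, not Twitter's actual scheme.
from datetime import datetime
from hashlib import sha1

SHARDS = 8  # hypothetical number of shards

def temporal_shard(created_at: datetime) -> str:
    """Old approach: one shard per time window, so recent tweets
    (and therefore most writes) all land on the newest shard."""
    return f"tweets_{created_at.year}_{created_at.month:02d}"

def hashed_shard(tweet_id: int) -> str:
    """Distributed approach: hash the tweet id so writes and reads
    spread evenly across all shards."""
    bucket = int(sha1(str(tweet_id).encode()).hexdigest(), 16) % SHARDS
    return f"tweets_shard_{bucket:02d}"

print(temporal_shard(datetime(2012, 1, 23)))  # -> tweets_2012_01
print(hashed_shard(123456789012345678))       # -> e.g. tweets_shard_05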
-
Bill Joy's greatest gift to man – the vi editor – The Register
Out of all of Bill Joy's contributions to technology, users appear most fond of one of the simplest - the vi editor.
Joy leaves a lasting legacy of work both in the general technology domain and at Sun Microsystems. Among Joy's list of achievements are BSD Unix, NFS, UltraSPARC designs and some work on Java. But it's vi, created in 1976, that really captured Reg readers' hearts.
"Bill's greatest gift to mankind was left off his list of achievements (in your article)... the vi editor," writes reader Matthew Hawkins in Australia. "I can live without NFS, Java and related technologies. I'm not sure if I can live without vi."
Matthew is not alone in his feelings. Other readers called vi, "Joy's lasting contribution to humanity" and agreed they could not have worked without it.
To do vi justice, we turn to Linux Magazine, which has one of the best accounts of how Joy came up with this little gem.
-
Donald Knuth - Computer scientist - Family history - Web of Stories
Donald Knuth's history. Enough said.
-
Avoiding Innovation's Terrible Toll - WSJ.com
The corporation isn't a sturdy species.
Given today's increased pace of technological change, even 40 years is going to start to seem like a really long time.
The wave of creative destruction looming over companies like Eastman Kodak Co., Blockbuster Inc., Barnes & Noble Inc. and the record labels has been focusing the minds of American executives on two questions: Are large companies able to innovate quickly enough in an age of rapid disruption? And if they can, how do they do it?
Business leaders, academics and venture capitalists say the large companies that do manage to survive are ruthless about change. The most successful ones aren't afraid to cannibalize their big revenue generators to build new businesses.
They often make frequent—but, crucially, small—acquisitions that bring in new technologies and open new markets. And there's always the unpredictable role of luck in business—both good and bad.
-
Pasta visualization « Sander Huisman
More than a year ago, I plotted a whole bunch of pastas using Mathematica. It was more of a challenge: could I plot the weirdest pastas in a couple of lines? And indeed, most of the pastas could be visualized in fewer than five lines. Here is one example.
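The original example was a short Mathematica parametric plot. As a rough stand-in (a hypothetical Python sketch, not Huisman's code), a fusilli-like helical ribbon can be drawn in a handful of lines:

# Hypothetical Python analogue of the idea (the original post used Mathematica):
# a fusilli-like helical ribbon as a parametric surface.
import numpy as np
import matplotlib.pyplot as plt

t, w = np.meshgrid(np.linspace(0, 6 * np.pi, 400), np.linspace(0, 1, 20))
x = (0.3 + w) * np.cos(t)  # ribbon swept around a spiral
y = (0.3 + w) * np.sin(t)
z = 0.4 * t                # pitch of the spiral

ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(x, y, z, cmap="YlOrBr")
plt.show()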
Also, see Pasta Inspires Scientists to Use Their Noodle - NYTimes.com.
The Strange Birth and Long Life of Unix
IEEE Spectrum has a long read[1] on the history of Unix and how it evolved to where it is today. It makes for interesting reading, with insights, stories and anecdotes. It is especially poignant in light of Dennis Ritchie's recent passing[2].
There's also another older post related to the history of Unix[3].
URL[1]: http://spectrum.ieee.org/computing/software/the-strange-birth-and-long-life-of-unix/0
URL[2]: https://www.opengear.net/blog/2011/10/14#Obituary-DennisRitchie-20111014
URL[3]: https://www.opengear.net/blog/2011/07/22#UnixHistory-20110722
More than just digital quilting
The Economist has some interesting coverage[1] of the Maker movement, discussing do-it-yourself culture, open hardware and open source, and how they can foster innovation and spur science and technology.
A few choice blurbs from the article:
The maker movement is both a response to and an outgrowth of digital culture, made possible by the convergence of several trends. New tools and electronic components let people integrate the physical and digital worlds simply and cheaply. Online services and design software make it easy to develop and share digital blueprints. And many people who spend all day manipulating bits on computer screens are rediscovering the pleasure of making physical objects and interacting with other enthusiasts in person, rather than online. Currently the preserve of hobbyists, the maker movement’s impact may be felt much farther afield.
Start with hardware. The heart of New York’s Maker Faire was a pavilion labelled with an obscure Italian name: “Arduino” (meaning “strong friend”). Inside, visitors were greeted by a dozen stands displaying credit-card-sized circuit boards. These are Arduino micro-controllers, simple computers that make it easy to build all kinds of strange things: plants that send Twitter messages when they need watering, a harp made of lasers, an etch-a-sketch clock, a microphone that serves as a breathalyser, or a vest that displays your speed when riding a bike.
Such projects are taking off because Arduino is affordable (basic boards cost $20), can easily be extended using add-ons called “shields” to add new functions and has a simple programming system that almost anyone can use. “Not knowing what you are doing is an advantage,” says Massimo Banzi, an Italian engineer and designer who started the Arduino project a decade ago to enable students to build all kinds of contraptions. Arduino has since become popular—selling around 200,000 units in 2011—because Mr Banzi made the board’s design “open source” (which means that anyone can download its blueprints and build their own versions), and because he has spent much time and effort getting engineers all over the world involved with the project.
Applying the open-source approach to hardware has also driven the development of the maker movement’s other favourite piece of kit, which could be found everywhere at the Maker Faire in New York: 3D printers. These machines are another way to connect the digital and the physical realms: they take a digital model of an object and print it out by building it up, one layer at a time, using plastic extruded from a nozzle. The technique is not new, but in recent years 3D printers have become cheap enough for consumers. MakerBot Industries, a start-up based in New York, now sells its machines for $1,300. The output quality is rapidly improving thanks to regular upgrades, many of them suggested by users.
URL[1]: http://www.economist.com/node/21540392/print
Friday, 22 July 2011
The Unix revolution — thank you, Uncle Sam? By Matthew Lasar
Ars Technica has a nice read[1] on the history of Unix, with an insider's view of how it developed and evolved. Here's a brief snippet from the article:
This November, the Unix community has another notable anniversary to celebrate: the 40th birthday of the first edition[2] of Ken Thompson and Dennis Ritchie's Unix Programmers Manual[3], released in November 1971. Producing the document was no easy task, because at that point the Unix operating system grew by the week; budding aficionados added new commands and features to the system on a regular basis.
"The rate of change of the system is so great that a dismayingly large number of early sections [of the text] had to be modified while the rest were being written," Thompson and Ritchie noted in their introduction. "The unbounded effort required to stay up-to-date is best indicated by the fact that several of the programs described were written specifically to aid in preparation of this manual!"
That's why Unix timelines are fun to read—they give a sense of how quickly the system collaboratively evolved. But some of them either skip[4] or mention without explanation[5] a government decision that, in retrospect, paved the way not only for Unix, but perhaps for the open source movement as well: the 1956 Consent Decree between the United States Department of Justice and AT&T.
Read on for the full article at arstechnica.com[1].
URL[1]: http://arstechnica.com/tech-policy/news/2011/07/should-we-thank-for-feds-for-the-success-of-unix.ars
URL[2]: http://www.cs.bell-labs.com/who/dmr/1stEdman.html
URL[3]: http://cm.bell-labs.com/cm/cs/who/dmr/manintro.pdf
URL[4]: http://www.unix.org/what_is_unix/history_timeline.html
URL[5]: http://www.computerworld.com/s/article/9133628/Timeline_40_years_of_Unix
Robert Morris, Pioneer in Computer Security, Dies at 78 - NYTimes.com
Robert Morris, a cryptographer who helped develop the Unix computer operating system, which controls an increasing number of the world’s computers and touches almost every aspect of modern life, died on Sunday in Lebanon, N.H. He was 78.
He was an important contributor to early Unix security and played a fundamental role in how operating systems evolved from then on.
URL: http://www.nytimes.com/2011/06/30/technology/30morris.html?pagewanted=all