
Wednesday, February 13, 2013

DNA data storage

The latest developments in data storage turn to biology. DNA sequencing not only allows big data to be stored for thousands of years but also allows for a more compact encoding system based on the four letters of the DNA bases rather than the binary zeros and ones that have been used until now. Still, the cost makes it prohibitive for now, and because data encoded in DNA cannot be updated in place, it is not quite the same as a flash drive. Read about it in Big Data: Tiny Storage.
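To illustrate why a four-letter alphabet is denser than a two-letter one: each base can stand for two bits, so a quaternary code needs half as many symbols as binary. Here is a minimal Python sketch; note the bit-to-base mapping is invented for illustration, and real DNA storage schemes add error correction and avoid long runs of the same base:

```python
# Illustrative only: map each pair of bits to one of the four DNA bases.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA strand: 4 bases per byte instead of 8 bits."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand produced by encode()."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

A single byte such as `b"A"` (binary 01000001) becomes the four-base strand "CAAC", half the symbol count of its binary form.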

Friday, February 1, 2013

Smartphone signals for retail analytics


Shopping online offers customers convenience and price transparency, but it offers retailers even more. As Amazon has demonstrated with its successful model, the information it derives from customer behavior online gives it insight it uses to tailor its marketing to the individual. Because your online browsing tracks not only what you buy but what you considered buying, the retailer gets to learn a lot more about you than the person who rings up your purchase at a store. How can a bricks-and-mortar establishment compete with that kind of analytic edge?

Post-Sandy Street Views

One thing about big data: it is not static. Situations are always changing, so what reflected reality one day can be out of date the next. That is especially true when a hurricane the likes of Sandy sweeps through and alters the landscape and the structures built on it.

But not everyone is pleased about updates that include images of their hurricane-ravaged homes. Read more at New Maps After Sandy

Thursday, January 24, 2013

That's big data entertainment!

In the past, the device held by someone watching television was usually a remote control. In the future, it is just as likely to be a mobile device. Starting next fall, television ratings will be measured in tweets as well as Nielsen numbers as social conversations are analyzed to calculate the reach of a program. Nielsen and Twitter announced the new form of rating last month, but their partnership has been in the works for over a year.

Snoopy was prophetic. 3-D movies have made a comeback. Consequently, movies today pack a lot more data per frame. But big data is also involved in the trend toward data streaming that is displacing discs and in the data on what people want to watch. 


Big data drives today's movie industry, both in terms of the amount of data packed into each frame you see at theaters and in terms of video streaming online. It's what delivers 3-D effects in the theater and personalized recommendations to Netflix viewers. And very big numbers ride on both.

In the past few years, 3-D movies have staged a comeback on a scale much greater than their brief heyday in the 1950s. Adding in the 3-D effect adds "anywhere from 100% to 200% more data per frame," according to Jeff Denworth, vice president of marketing for DataDirect Networks (DDN). Denworth attributes the proliferation of present-day 3-D films to the huge success James Cameron had with the 3-D film "Avatar" in 2009, which packed a petabyte of data.

"Avatar" cost about $237 million to produce, but it brought in more than ten times that amount, earning the distinction of what IMDB identifies as "the highest-grossing film of all time." By the beginning of 2010, it had taken in $2,779,404,183. A rash of 3-D films followed this success, and many did very well. According to iSuppli Market Intelligence (owned by IHS), in 2011 3-D films brought in $7 billion at the box office, 16 percent more than the previous year.


The full figures for 2012 are not yet in, though they will likely be higher, as the number of 3-D screens has gone up from about 9,000 in 2009 to 43,000 by the third quarter of 2012. One of the biggest draws of the year, Marvel's 3-D superhero flick "The Avengers," grossed $1,511,757,910 in 2012. As 3-D has grown so common at the theater, movie-makers have to point to something else to distinguish their offerings.


"The Hobbit: An Unexpected Journey" had to go 3-D one better with its "brand new format High Frame Rate 3D (HFR 3D)." Instead of the 24 frames per second that is the movie standard, it packs in 48. The advantage to the viewer, it claims, is that the greater number offers an experience "closer to what the human eye actually sees." Perhaps so, but quite a number of viewers were less than thrilled by the effect. Nevertheless, by December 29, 2012, "The Hobbit" had already taken in $600,508,000, according to IMDB figures.
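The data implications of HFR 3-D compound quickly: doubling the frame rate doubles the data, and stereo 3-D doubles it again. A back-of-the-envelope sketch (the resolution and color depth below are assumptions for illustration, not studio figures):

```python
# Rough uncompressed data rates: standard 24 fps vs. HFR 3-D at 48 fps.
width, height = 2048, 1080        # 2K cinema resolution (assumed)
bytes_per_pixel = 3               # 8-bit RGB (assumed)
frame = width * height * bytes_per_pixel   # bytes in one frame

standard = frame * 24             # 24 fps, single image stream
hfr_3d = frame * 48 * 2           # 48 fps, two streams (one per eye)

print(f"Standard: {standard / 1e6:.0f} MB/s")   # ~159 MB/s
print(f"HFR 3-D:  {hfr_3d / 1e6:.0f} MB/s")     # ~637 MB/s, 4x as much
```

Under these assumptions, HFR 3-D carries four times the raw data of a conventional film, which is consistent with the "100% to 200% more data per frame" range quoted above once stereo alone is considered.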


Big data is also the key to watching movies on the small screen. Instead of picking up a disc when they buy or rent a movie, people now can just have it come right to them. As Dan Cryan, senior principal analyst at IHS, observed, in 2012 Americans made "a historic switch to Internet-based consumption, setting the stage for a worldwide migration from physical to online."


Estimates of online movie payments for the US in 2012 are "3.4 billion views or transactions, up from 1.4 billion in 2011." This form of video streaming is dominated by Netflix in the US, where it makes up "33% of peak period downstream traffic." Amazon, Hulu, and HBO Go follow far behind at 1.8%, 1.5%, and .5% respectively. Netflix intends to keep its lead with the help of big data.


Netflix was the subject of a WSJ blog on using big data to improve streaming video. Though Netflix still offers to mail out the DVDs people select for rental, more customers now opt for streaming. In the interest of improving efficiency on that end, Netflix transferred its holdings to Amazon's cloud. It also started using Hadoop, which enables it "to run massive data analyses, such as graphing traffic patterns for every type of device across multiple markets." That helps it plan for improved data transmission and understand its customers better.


In addition to using big data solutions for delivery of content, Netflix applies algorithms to predict what its customers would likely want to watch next. This type of data mining technology makes Netflix confident that it can handle hosting original content. In fact, it bet more than $100 million on it; that's the reported sum paid for the rights to two seasons of House of Cards, one of several original content series it plans on streaming.


 As Netflix’s Chief Communications Officer, Jonathan Friedland, says, "We know what people watch on Netflix and we’re able with a high degree of confidence to understand how big a likely audience is for a given show based on people’s viewing habits." 


So what do you think? Is it possible to guarantee a hit with big data?


Thursday, January 17, 2013

Seeing stones for military, rescue, and security operations


What do JRR Tolkien, JP Morgan Chase, the military, and rescue workers have in common? Palantir.
"The Palantír" is the title of the 11th chapter of Tolkien's The Two Towers. The name refers to the "seeing stones" that allow one to view what is happening elsewhere. In 2004, the name was also taken on by a company that develops software organizations use to extract meaning from various streams of data to combat terrorism, fraud, and disaster damage.


Palantir distinguishes its approach from data mining by calling it "data surfacing." Read more at 

From Sorcery to Surfacing Data


For more on big data used by the army, see  

National Safety in Big Numbers

"You can't have a data Tower of Babel" in which each system keeps its data isolated from other systems, Patrick Dreher, a senior technical director at DRC, told Military Information Technology. His company worked with the US Army on the Rainmaker cloud-based intelligence system, which integrates different data models used by the intelligence community. "For example, when Afghan drug lords finance Taliban insurgents, data from one database can be combined with Taliban financing data from an Army database inside the cloud, allowing analysts to make timely, critical connections and stay one step ahead of insurgents."

Thursday, January 10, 2013

Big Data on the Final Frontier


Missions in space may come and go, but the National Aeronautics and Space Administration has always stuck to a mission of bringing in data.

One of its early achievements in this field was sending a spacecraft close enough to Venus to get accurate readings of its surface and atmosphere. On Dec. 14, 1962, the Mariner 2 spacecraft got within 34,762km (21,600 miles) of the planet. Over a 42-minute period, it was able to pick up many points of data that proved Venus, which had been thought of as Earth's twin, would be uninhabitable, with a surface temperature of 425°C (797°F) and a toxic atmosphere.
This picture (from NASA's site) of the data gathered in that mission is cropped. The paper showing the data that was gathered is actually much longer, as this uncropped version shows.

Back then, the data covered a roll of paper, but the data NASA handles today takes supercomputing power to process. As Nick Skytland wrote in a NASA blog post in October:
In the time it took you to read this sentence, NASA gathered approximately 1.73 gigabytes of data from our nearly 100 currently active missions! We do this every hour, every day, every year -- and the collection rate is growing exponentially...
In our current missions, data is transferred with radio frequency, which is relatively slow. In the future, NASA will employ technology such as optical (laser) communication to increase the download and mean a 1000x increase in the volume of data. This is much more than we can handle today and this is what we are starting to prepare for now. We are planning missions today that will easily stream more 
[than] 24TB's a day. That's roughly 2.4 times the entire Library of Congress -- EVERY DAY. For one mission.
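Skytland's "roughly 2.4 times the entire Library of Congress" comparison checks out against the commonly cited rough estimate of about 10 TB for the Library's print collection (that 10 TB figure is an oft-quoted approximation, not an official number):

```python
# Sanity-checking the quoted comparison; all figures are approximate.
mission_tb_per_day = 24        # projected streaming volume per mission
library_of_congress_tb = 10    # widely quoted rough estimate for print holdings

ratio = mission_tb_per_day / library_of_congress_tb
print(ratio)  # 2.4 -- matching the "roughly 2.4 times" in the quote
```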
Read more at

Big Data on the Final Frontier

Sunday, January 6, 2013

Is your face your calling card?


Many books include pictures of the author on the back cover or inside the jacket. That is one thing I never bother to check when considering whether or not I want to read a book.  I  still don't really think about the author's appearance as I read. And I don't really think about my own as I write. 

I use a quill for my signature picture here, as well as on my other blogs. It also serves as my profile photo on Facebook, Google+, and Twitter. I feel it conveys what I am about more accurately -- in terms of my role as writer -- than my photo would. Or maybe I'm just camera-shy.

On  the other hand, my actual photo does serve as my profile picture for the UBM boards on which I write. The policy there, as it is for many newspapers, is to require a photo for the writers. Those who comment only and don't blog can get away with using any picture they like for their profile photo or just use the default picture if they don't bother to upload one of their own. 


Once I had my picture posted in that way, I put it in for my LinkedIn profile as well. It seemed more consistent to have the same picture represent me there. Also, the more standard practice on LinkedIn is to use an actual photo rather than a representational picture. I still can't see attaching a photo to a resume, though anyone who wishes to find my photo simply has to do an online search to find one in a fraction of a second.


While the net does tend to attach author faces to content, I don't believe I am more drawn to articles that feature faces. I must be in the minority, though, because I'm certain that those who demand faces find them effective at drawing more audience interest.


What do you think about the face as calling card?