Accenture Shaping the Future Forum 2012

This is from my notes, taken to the best of my ability, during Dr. Srini Devadas’ talk on July 26, 2012.

Programming the Future

“Look to the future because that is where you will spend the rest of your life.” – George Burns

Can Computer Software Help?

Talk Outline:

  • From the first programs to where we are now
  • Computing paradigms for the next 40 years
  • Where software needs to be in 2050

Calculation, Computers and the First Programs

Chinese abacus. Image from Wikipedia.

Calculating Machine

Replica of Schickard’s Calculating Clock. Image from Wikipedia.

Wilhelm Schickard designed the Calculating Clock, the first known calculating machine, and described it in letters to Johannes Kepler. It was destroyed in a fire in 1624, before Kepler could actually use it.


Problem with Schickard’s and Pascal’s machines: limited capability to carry out a linked sequence of calculations (all intermediate results had to be transcribed and re-entered by hand).

Charles Babbage

  • created the Difference Engine (1823), which could evaluate polynomials.
  • In 1833, he began work on the Analytical Engine.
  • Babbage died in 1871.
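The Difference Engine’s trick was the method of finite differences: once a polynomial’s starting value and its differences are loaded, every further value needs only additions, which gears can do. A minimal sketch in Python (the polynomial x² + x + 41 is one Babbage reportedly used in demonstrations; the function name is mine):

```python
# The Difference Engine tabulated polynomials using only additions.
# For a degree-d polynomial, the d-th finite difference is constant, so
# loading [p(0), Δp(0), Δ²p(0), ...] lets us roll out p(1), p(2), ...

def tabulate(initial_differences, steps):
    """Produce successive polynomial values by repeated addition only."""
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # each difference absorbs the one below it -- pure addition
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# p(x) = x^2 + x + 41: p(0) = 41, first difference = 2, second difference = 2
print(tabulate([41, 2, 2], 5))  # [41, 43, 47, 53, 61]
```

No multiplication is ever performed, which is exactly what made the scheme mechanizable with 1820s technology.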

The First Programmer – Ada Byron a.k.a. Lady Lovelace

  • Luigi Menabrea published an account of Babbage’s lectures in Italy, which Ada translated into English, adding extensive notes of her own.
  • Her notes contained an algorithm to compute the Bernoulli numbers.
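Ada’s Note G expressed the computation in the Analytical Engine’s own operations; in modern terms, the same numbers fall out of the standard recurrence Σₖ C(m+1, k)·Bₖ = 0. A sketch in present-day Python, not her scheme:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers (convention B_1 = -1/2), via the
    recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-s, m + 1))  # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```

Exact rational arithmetic (`Fraction`) matters here: the Bernoulli numbers are fractions, and floating point would drift quickly.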

Harvard Mark I & II

1944, IBM Labs (Endicott) – Howard Aiken (Professor of Physics) developed a machine built from electromagnetically controlled relays -> the Harvard Mark I.

Grace Murray Hopper – worked on the Harvard Mark I and II, wrote the first compiler, and was instrumental in creating the computer language COBOL.

The First Bug?

First Computer Bug
Sept. 9, 1947 – a moth was found trapped between the points of relay #70 in the Mark II. Operators logged that they had “debugged” the machine.


ENIAC

  • ENIAC (Electronic Numerical Integrator and Computer) was developed by John Presper Eckert & John Mauchly in 1943 – 1945 (World War II)
  • 1st completely electronic calculator
  • 20% accuracy
  • stats: 30 tons, 72 square meters, 200 kW
  • reads 120 cards per minute

50 Years of Hardware Innovation

50 Years of Hardware Innovation

Today: 100 processors on a single chip, running at ~1 GHz, requiring ~20 W of power = 10 million times faster than ENIAC.
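A quick back-of-envelope check of that speedup figure, assuming ENIAC’s commonly cited ~5,000 operations per second (the modern numbers are from the slide):

```python
# Rough sanity check of "10 million times faster than ENIAC".
eniac_ops = 5_000                  # ops/sec -- commonly cited figure (assumption)
modern_ops = 100 * 1_000_000_000   # 100 cores x ~1 GHz, ~1 op per cycle
speedup = modern_ops / eniac_ops
print(f"{speedup:.0e}")  # 2e+07 -> on the order of ten million
```

The exact ratio depends on what counts as an “operation”, but the order of magnitude holds.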

50 Years of Software Innovation

50 Years of Software Innovation

Fun fact: the Chevy Volt has 10 million lines of code, created in 29 months.


What About the Next 40 Years?

Three Computing Paradigms

Programming for Everyone

Computing Paradigm: Programming for Everyone


Game made from Scratch. You know what I mean 😛
  • Programming for Kindergarteners
  • MIT App Inventor (brainchild of Prof. Hal Abelson)
  • Let’s Make Driving Safe
  • ComPal (Combur Urine Test Analyzer)
  • Going All the Way: Programming in English by Rinard et al

BIG Data

Data is Everywhere!

For example: Web Analytics

Large web enterprises: thousands of servers, millions of users, and terabytes per day of “click data”.

Not just simple reporting: e.g., in real time, determine what users are likely to do next, which ad to serve them, or which users they are most similar to.

Existing analytics systems either do not scale to the required volumes or do not provide the required sophistication.

BIG DATA Challenge

Big data is data that is too big, too fast, or too hard to process.

Meeting the Big Data Challenge


  • Big Data Algorithms, Piotr Indyk et al
    • Challenge: develop faster algorithm for processing massive data sets
  • Understanding Images: Antonio Torralba et al
    • Challenge: understanding visual scenes
  • Detecting Defaults: Andrew W. Lo et al
    • Challenge: consumer credit risk analysis and forecasting
    • Approach: machine learning (detects potential defaults more accurately than FICO credit scores)
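To give a feel for what “faster algorithms for massive data sets” look like, here is a Count-Min sketch, a classic sub-linear-space streaming structure (Cormode and Muthukrishnan’s, used purely as an illustration; the talk did not show code):

```python
import hashlib

class CountMinSketch:
    """Approximate frequency counts over a stream in fixed memory:
    the table size never grows with the number of distinct items."""
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _cells(self, item):
        # one hashed cell per row; md5 used here only as a cheap hash
        for row in range(self.depth):
            h = hashlib.md5(f"{row}:{item}".encode()).hexdigest()
            yield row, int(h, 16) % self.width

    def add(self, item):
        for row, col in self._cells(item):
            self.table[row][col] += 1

    def estimate(self, item):
        # never under-counts; collisions can only inflate the estimate
        return min(self.table[row][col] for row, col in self._cells(item))

cms = CountMinSketch()
for word in ["click"] * 100 + ["view"] * 7:
    cms.add(word)
print(cms.estimate("click"))  # >= 100; equals 100 unless hashes collide
```

The trade-off is exact counts for bounded memory plus a small, one-sided error, which is exactly the kind of bargain streaming algorithms strike.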

Crowds to Cloud

In the future, billions of people will use devices to connect to trillions of processors in the cloud.


  • CarTel: Balakrishnan and Madden
    • CarTel system is an end-to-end exploration of techniques to collect, store, and manage sensor data from cars, phones and people.
  • TwitInfo: Adam Marcus et al
    • Challenge: the raw Twitter interface is not ideal for summarizing interesting events from millions of tweets per day
    • How do we aggregate to understand what people are saying, how they feel, where they are, and what they are linking to? Applications: journalism, marketing, finance.
  • Projects in Human Computation: Rob Miller et al
    • Challenge: real-time crowd systems with high quality, low latency and low cost to tackle hard AI problems
    • Examples:
      • VizWiz lets blind people ask visual questions of a crowd (take picture, speak a question, get an answer in less than a minute)
      • Adrenaline is a smart camera shutter (take a 10-second video, get crowd to pick the best still-picture from it in less than 10 seconds)

The best approach to tackling BIG DATA is to combine the strengths of both human and machine.


The World in 2050

“The best way to predict the future is to invent it.” – Alan Kay

Year 2050
Do we allow this?

What computer scientists are doing today to prevent these:

  • Tackling Climate Change
    • Predict “unpredictable” weather patterns.
  • Financial Markets Causality Connections 1994 – 1996 by Andrew W. Lo
    • Factors that caused the recession: in 2006 – 2008, insurance companies were densely connected to the rest of the financial system.
  • Avoiding Financial Breakdown
    • track the dependence between financial sectors EVERY DAY and include all financial organizations.
    • program automated measures to track crowded trades, dependencies and other risk exposures.
  • Solving “Big City” Problems: MIT’s Autonomous Vehicle
  • Towards Personal Systems Genomics
    • What we can do today using 23andMe – track genealogy
    • Fact: 45 – 71% of disease risk is attributable to genetics; the rest to the environment.
    • Epigenomics – functionally relevant modifications to the genome that do not involve a change in the nucleotide sequence.
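The “track dependence between sectors every day” idea can be sketched as a daily pairwise-correlation scan over sector returns. A toy illustration with made-up numbers, not Lo’s actual methodology:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

def flag_dense_links(returns, threshold=0.8):
    """Flag sector pairs whose daily-return correlation exceeds the
    threshold -- a crowded-trade / dependency warning signal."""
    sectors = list(returns)
    flagged = []
    for i, a in enumerate(sectors):
        for b in sectors[i + 1:]:
            if pearson(returns[a], returns[b]) > threshold:
                flagged.append((a, b))
    return flagged

daily_returns = {  # hypothetical data for illustration
    "banks":     [0.010, -0.020, 0.030, -0.010, 0.020],
    "insurance": [0.012, -0.019, 0.028, -0.008, 0.021],
    "hedge":     [-0.010, 0.020, -0.005, 0.015, -0.020],
}
print(flag_dense_links(daily_returns))  # [('banks', 'insurance')]
```

Run daily over all institutions rather than three toy sectors, this is the flavor of automated monitoring the bullet above describes.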
