
Reading A Speech

August 5, 2012


How to read a speech, if you must

George Peterson is the President of the Georgia Institute of Technology, but he likes to be called “Bud.” He is our eleventh president, and oversees all aspects of our university with its 21,000 students. His background is in mechanical engineering, with a PhD from Texas A&M University. While much of his career has been in administration, he has done seminal work in many areas. One is a famous paper presented at the conference named coolzone: “where people go when the heat is on.” Pretty neat name for a conference. In recognition of his many achievements he is currently on the National Science Board, which governs the NSF.

Today I want to relate some experiences that I just had in giving the commencement address at this summer’s graduation at Tech.

I had this honor since I was the faculty member selected to carry the “mace.” The mace is a special symbolic object carried by the faculty member who leads the president’s party onto the stage at the Georgia Tech graduation. As with most places, outside guests usually give the address, but summer is different. Our main commencements are winter and spring, so we use someone local for the summer one. This year it was I.

I thought I would share some general comments about reading a speech rather than just speaking, and share the actual speech that I read.

Reading A Speech

Here are a couple of ideas that helped me give my written speech.

• I got professional help from an expert. I wrote the first draft of the speech and the last draft, but Patti Futrell helped tremendously in changing parts of it. Having an expert look over your speech is valuable. I also had help from my wife Judith Norback, a researcher at Tech whose specialty is communication. One of her rules for students is not to memorize or read a speech. However, an address of this type, for many reasons, has to be written ahead of time and has to be read.

• I gave the speech at night in a fairly dark Fox Theatre. Those of us on the stage could not see anyone in the audience because of the extremely powerful lights. They were directed right at our faces and I could not see anything. In an unexpected way I think this helped greatly in making me less nervous. If you can’t see ‘em, then you do not think about the thousands of people who are listening to your speech.

• I also got an origami-type suggestion from President Peterson that was invaluable. Before I gave the speech I searched for advice on reading speeches, but none mentioned his simple but cool idea. I should have asked him where he learned the trick.

The speech is in a loose-leaf book that is on the stage. When you go up you can read from the book. Of course every now and then you must turn to the next page. The last thing you want is to have any trouble in separating the pages: you know how paper can stick together.

The president took me up to the podium and showed me the speech book. He then said that one trick is to cringle up each page of my speech. In this way the pages would not stick together, I would have no trouble turning them, and one worry would be gone. I did cringle my pages, and it worked perfectly. What a cool idea.

The Speech

President Peterson, administration and faculty, distinguished guests, friends and loving families, and most of all, the Georgia Tech Class of 2012. I am deeply honored to be here, and it is a tremendous privilege to be addressing you today.

Congratulations, you made it. Wonderful.

Your class is of course potentially one of the most notable classes ever to graduate from Tech. No, I am not talking about the world situation, nor about our difficult economy, but about the Mayan prediction. Yes, the 2012 class could be one of the last classes ever.

I certainly hope not, but I checked the source of all knowledge—Wikipedia—and 21 December 2012 is the end-date of a 5,125-year-long cycle in the Mesoamerican Long Count calendar.

One interpretation of this transition is that the date marks the start of a time in which Earth and its inhabitants may undergo a positive physical or spiritual transformation, and that 2012 may mark the beginning of a new era. Others suggest that the 2012 date marks the end of the world or a similar catastrophe.

Scenarios suggested for the end of the world include the arrival of the next solar maximum, or Earth’s collision with an object such as

  • a black hole,
  • a passing asteroid,
  • or a planet called “Nibiru”.

(How, you might ask, do they know the name of the planet? I have no idea.) Of course, there’s always the chance that none of these events will occur. It just takes one person to edit the facts in Wikipedia and everything could change! I believe and hope that all the negative predictions are wrong, but they remain in the future and we will see who is right in exactly 20 weeks.
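As an aside for readers who like to check arithmetic: the countdown is easy to verify in a few lines of Python. The exact speech date is an assumption here; I use this post’s date, August 5, 2012, as a stand-in.

```python
from datetime import date

# How far away is the end of the Long Count cycle? The speech date
# below is an assumption (this post is dated August 5, 2012).
speech_day = date(2012, 8, 5)
long_count_end = date(2012, 12, 21)

days_left = (long_count_end - speech_day).days
weeks_left = days_left / 7
print(days_left, round(weeks_left))  # 138 days, roughly 20 weeks
```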

This prediction made me think about how we do predict the future, and how poor we often are at doing it. There are many methods for predicting the future. I am reminded of an explanation by Scott Adams, the creator of the “Dilbert” cartoon. He said:

For example, you can read

  • horoscopes,
  • tea leaves,
  • tarot cards,
  • or crystal balls.

Collectively, these methods are known as ‘nutty methods.’ Or you can put well-researched facts into sophisticated computer models, more commonly referred to as ‘a complete waste of time.’

I certainly hope he was limiting his statement to predicting the future rather than doing research in computer science, because that is my life’s work.

With this in mind I thought I might look at the past in order to try and predict the future. So here are a few things that happened exactly fifty years ago. I know for the graduates fifty years ago is forever, but it seemed like a nice round number, so let’s look at 1962:

• Wilt Chamberlain scores 100 points in a single NBA basketball game. And this is before there were three-point shots.

• The Beatles release their first single for EMI, “Love Me Do.”

• The term “personal computer” is first mentioned by the media.

• The first Walmart store, then known as Wal-hyphen-Mart (which is still the corporate name), opens for business in Rogers, Arkansas.

• American advertising man Martin Speckter invents the interrobang, a new English-language punctuation mark that combines a question mark and an exclamation mark.

The last one is personal: I met Martin Speckter, since at the time my dad, Jack Lipton, was the art director for Speckter’s ad agency. Not surprisingly my dad designed the first rendering of the new symbol for his boss. If only they had seen it become a heavily used symbol. Oh well. It is Unicode U+203D and HTML numeric entity 8253, for those engineers here who are taking notes.
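For the note-takers: the code point and the entity number are the same value written in two bases, which a few lines of Python (a sketch with the standard library only, not part of the speech) can confirm.

```python
# U+203D is the interrobang; 8253 is just 0x203D written in decimal,
# which is why the HTML numeric character reference is &#8253;.
interrobang = "\u203D"

print(interrobang)                  # the glyph itself: ‽
print(ord(interrobang))             # 8253
print(f"&#{ord(interrobang)};")     # &#8253;
assert interrobang == chr(0x203D) == chr(8253)
```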

Rather than hear my thoughts on predictions, I thought you might enjoy the words of experts making predictions of the future. These are experts who were, well, a bit off the mark.

  • “The telephone has too many shortcomings to be seriously considered as a means of communication.” –Western Union internal memo, 1876.

  • “Drill for Oil? You mean drill in the ground to try and find oil? You’re crazy.” –Response reported by Edwin Drake as he tried to hire workmen who knew oil just bubbled out of the ground, 1859.
  • “Everything that can be invented has been invented.” –Charles H. Duell, Commissioner, U.S. Office of Patents, 1899.
  • “Airplanes are interesting toys but of no military value.” –Marshal Ferdinand Foch (who became the supreme allied commander in World War I), 1911.
  • “The wireless music box has no imaginable commercial value. Who would pay for a message sent to nobody in particular?” –David Sarnoff’s associates in response to his urgings for investment in the radio in the 1920s.
  • “Who the hell wants to hear actors talk?” –H.M. Warner, Warner Brothers, 1927.
  • “I think there is a world market for maybe five computers.” –Thomas Watson, chairman of IBM, 1949.
  • “Computers in the future may weigh no more than 1.5 tons.” –Popular Mechanics, 1949. Actually quite correct: computers are much lighter than that. Much.
  • “I have travelled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won’t last out the year.” –Editor in charge of business books for Prentice Hall, 1957.
  • “There is no reason in the world anyone would want a computer in their home. No reason.” –Ken Olsen, Chairman, DEC, 1977.

  • “640K of RAM ought to be enough for anybody.” (That is 640,000 bytes.) –Bill Gates, 1981.

But that statement is not Bill Gates’ legacy. He of course co-founded Microsoft, which has been a leading multinational company for more than 20 years. He became the youngest multi-billionaire in U.S. history. He established the Bill and Melinda Gates Foundation in 1994, and since then it has provided more than $26 billion in grants to help people throughout the world live healthy and productive lives. He kept up with technology, and assembled teams of people who would create new technologies and bring innovations to the manufacturing floor. While it’s safe to say he might not be great at predicting the future, he has really made an impact in changing it for probably billions of people.

You don’t have to be on the Forbes list of the world’s richest people to make a difference.

As new Georgia Tech graduates, you are equipped with the knowledge and skills to change the world too. Many of you didn’t wait until graduation to make a big impact.

Just one example is the Service Learning Initiative, which brings together professors and students from several disciplines to apply science and technology to address pressing societal problems. It is in response to a growing student interest in socially relevant projects that expand and leverage their skills.

Georgia Tech students are helping in schools across the street from the Institute, and they’re traveling halfway around the world to help remote villages with clean drinking water. They’re designing solar toilets that are saving lives in developing countries. They’re creating inexpensive devices that can be used to quickly diagnose pneumonia, saving the lives of thousands of children in developing countries.

Many of you have been involved in these and other projects to improve the human condition through science and technology. Now as Georgia Tech graduates, you’ve been equipped to do even greater things in the future.

I opened with a dire look at the future from a Mayan calendar more than 5,000 years ago. If you were to believe that as our destiny, you might develop a defeatist attitude. President John F. Kennedy once said

We have come too far, we have sacrificed too much, to disdain the future now.

Every generation has had its challenges. Yours is no different. The difference is that as graduates from one of the nation’s best research universities, you are equipped to develop innovative solutions that will impact not only your generation, but those to come.

Alan Kay, a famous computer scientist who spoke here at a previous commencement, is well known for saying:

The best way to predict the future is to invent it.

I would like to charge this great class to do just that. Forget all I said today but this one quote. Do not look back, do not predict, but act and make the future.

Finally, thank you for this honor. And for the record, I am confident that there will be thousands of other Georgia Tech graduates in the years to come. If we want to take anything from that Mayan prediction, let’s focus on 2012 marking the beginning of a new era. One that your class helps make happen.

Thank you again.

Open Problems

One is, clearly, the Mayan prediction: what will happen at the end of this year? Peterson joked to us, before my speech, that if the world were really ending then, that would solve our continuing budget issues. Another is, are there limiting predictions being made today that our graduates might help make look equally naive? For example, is “Big Data” a passing fad?

10 Comments
  1. Serge permalink
    August 6, 2012 6:50 pm

    > “The best way to predict the future is to invent it.”

    In computer science, this can be rephrased as: the only way to know whether an optimal algorithm exists is to try to design the best possible approximation of it.

    In quantum physics, the exact position of a particle can be known only up to some probability. Similarly in computer science, the existence of an optimal algorithm can be approached only constructively by trial and error.

    In other terms, my thesis is: it is no more possible to know whether an optimal algorithm – such as a polynomial algorithm for SAT – exists than it is to know for sure the exact position of a particle. To some extent, designing an algorithm can be compared to measuring the position of a particle.

    • Serge permalink
      August 7, 2012 8:11 am

      If you replace probability by complexity, you can develop a theory of computational complexity based on the model of quantum physics. An event that’s unlikely to be observed in quantum physics will become a function that’s hard to compute or a fact that’s hard to prove.

  2. August 6, 2012 8:15 pm

    “Meg, you need a quantum computer like you need a hole in your head. Oh well.”
    Richard Lipton (2012)

  3. August 6, 2012 11:37 pm

    I tried your advice about the “cringling” the pages of my speech. Well, I’m still not sure what was meant by “cringle”, but I figured it meant “crinkle” or “crumple”. I wound up on stage with my skinny jean pockets stuffed full of the crumpled pages of my speech. Moreover, I hadn’t put them in there in any order, so I was just gonna have to find them as I needed them. Somewhere in the middle of my speech, while I was trying for the umpteenth time that evening to cram my sweaty hand down that already soaked, flattened pocket, it dawned on me that you must’ve meant crinkle after all! :)

  4. Hamid permalink
    August 7, 2012 1:53 am

    What about algorithmic cosmic Galois theory although even algorithmic Galois theory is most probably still within P=NP problem? Didn’t Godel advise von Neumann for/against doing this noncommutative/commutative geometrically in the lost letter? Imagine algorithmic cosmic Galois-Connes theory on the quintics and beyond. Of course, I somehow peeked in through an object called VPN the password of which was beyond my reach!

  5. Hamid permalink
    August 7, 2012 2:13 am

    Algorithmic cosmic Galois theory on the conics, cubics and, quartics might also do the job at least as far as the generalization of proof of cosmic Fermat’s last theorem is concerned, of course.

  6. John Sidles permalink
    August 7, 2012 5:52 am

    Dick asks: “Are there limiting predictions being made today that our graduates might help make look equally naive?”

    Dick, we can look to this year’s coolZONE-12 conference for a broad-spectrum answer:

    Using computers to go where experiments cannot

    For decades, the steady increase in modern computing technology has allowed for faster as well as larger, more complex simulations. With all this computing power, there are still only two areas in which a numerical simulation is better than experimental data: (1) quick, reliable (5%-20% error) simulations for design optimization, and (2) massive resolution, highly accurate (< 1% error) simulations.

    In both cases the computer is going where experiments cannot. In the quick-simulation case, the simulations are faster and cheaper than any experiment, yielding results (hopefully trustworthy results) fast enough to be included in a design cycle. In the highly-resolved case, the resolution is far beyond any experimental measurements.

    Understood broadly, the coolZone vision:

    • encompasses quantum dynamics as well as classical dynamics, and

    • encompasses biological dynamics as well as engineered dynamics, and

    • encompasses information flow and sensing as well as energy flow, and

    • has no known fundamental obstructions to continue exponential progress.

    If we extend these trends for even 5-10 years, isn’t it true that the resulting capabilities of systems-level engineering encompass most of what passes for fundamental research at universities like Georgia Tech?

    It might perhaps be argued that departments like Mathematics and Literature are insulated from these trends. But these departments have their own revolutions-in-progress, in naturality and narrative.

    It is these three confluent revolutions — in naturality, numerics, and narrative — that are changing forever students’ ideals of what modern research universities are and do.

  7. August 7, 2012 8:40 am

    Possibly typo: “course eveyr now”

  8. Serge permalink
    August 7, 2012 9:11 am

    > American advertising man Martin Speckter invents the interrobang, a new English-language punctuation mark that combines a question mark and an exclamation mark.

    I think there’s a time for being surprised and a time for trying to find an answer. In a similar fashion, some constructed languages – such as Esperanto and Lojban – have grammatical constructs that allow for interro-imperative sentences, although you don’t see such things in natural languages. The !? pattern is more reminiscent of cartoons than of anything else… :)

  9. August 10, 2012 5:49 pm

    Apparently, it is a historical myth that the commissioner of America’s Patent Office recommended the Office be abolished in 1899. Rather, he asked for a funding increase because of the heavy load of patent applications, arguing that anyone who would deny him additional funding must believe that “everything that can be invented has been invented”.

    Source:

    http://www.economist.com/category/print-sections/letters?page=20
