Research

URLs

http://www.americanhistory.abc-clio.com/Search/Display.aspx?categoryid=23&entryid=391917&searchtext=movies&type=simple&option=all

http://www.americanhistory.abc-clio.com/Search/Display.aspx?categoryid=23&searchtext=movies&type=simple&option=all&entryid=379980&issublink=true&fromsearch=false

http://www.americanhistory.abc-clio.com/Search/Display.aspx?categoryid=23&searchtext=movies&type=simple&option=all&entryid=263202&issublink=true&fromsearch=false

http://digitalstorytelling.coe.uh.edu/

http://www.jstor.org/stable/40165601?seq=2

http://imoberg.com/files/World_of_Digital_Storytelling_The_Ohler_J._.pdf

Research Articles

=Digital Movies Are the Future of the Film Industry=

Steven Ascher, "The Digital Revolution," eJournalUSA, June 2007. Reproduced by permission of author.

"... it's astounding to see how much it has changed the way movies are made, the stories they tell, where they're shown, how much they cost, and who's watching."

Steven Ascher is an Academy Award-nominated director whose films include //Troublesome Creek// (winner of the Sundance Grand Jury Prize) and //So Much So Fast//. He is the author of //The Filmmaker's Handbook, a Comprehensive Guide for the Digital Age//, a bestselling text. In the following viewpoint, Ascher claims that the digital era is transforming how motion pictures are made, distributed, and seen. For example, the author maintains that digital video technology is reducing production costs, thus allowing filmmakers to bypass major studios and inexpensively produce and distribute movies through the Internet to specialized audiences. Ascher further contends that from digital projectors to high-definition televisions to tiny MP3 player screens, movies are reaching audiences in many new ways. As you read, consider the questions listed at the end of the article.

In the history of motion pictures there have been decisive moments when a new technology changed everything. In 1927, //The Jazz Singer//—the first "talkie"—marked the beginning of the sound era. Suddenly, as comically portrayed in //Singin' in the Rain//, silent film stars were out and a new type of star and a new type of story were in, changing how movies were written, filmed and shown. Today, digital technology is driving a revolution that's even more earthshaking. Children who have grown up in the Internet era don't realize how seismic the changes have been. Movies—all kinds of media, really—will never be the same.

What digital means technically is that pictures and sounds are converted to digital data (ones and zeros) that can be stored, manipulated and transmitted by computers. Once in digital form, a world of possibilities opens up. The digital era in movies began in the 1980s, but picked up momentum around 1990. From the beginning, digital technology was used to create new kinds of images. George Lucas's company, Industrial Light and Magic, pioneered astonishing visual effects that made the most fantastic space stories look stunningly realistic. With programs like Photoshop we could now digitally alter pictures—say, to remove a person or add a building—which changed our basic understanding of photographed reality. In the digital era, statements like "pictures don't lie" and "seeing is believing" are clearly untrue. Digital editing systems helped shape new filmmaking styles and techniques, such as the use of very short shots, graphics that fly around the screen and objects that seamlessly transform (morph) into other objects. The look of most TV commercials today would not be possible without digital tools.

The 1990s brought an explosion in digital video (DV) and the now-familiar miniDV camcorders that give amateurs the ability to shoot and edit inexpensive, very good-quality video. Independent filmmakers seized DV cameras and used them to make movies that were suddenly being shown on television and at prestigious film festivals like Sundance. In the traditional Hollywood production model, films are shot with big 35mm film cameras with big crews to handle them. While DV is not up to 35mm quality, it's good enough and cheap enough that a wide range of fiction and documentary projects can be made in DV that would have been impossible, or impossibly expensive, before.

As digital video took off, so did the Web. At first, Hollywood didn't know what to do with it. //The Blair Witch Project//, a 1999 low-budget thriller shot with small-format video cameras, is credited as the first movie to exploit the Internet's marketing power.
By posting hints on the Web that the horror in the film was real, the producers sparked intense debate, helping propel the film to a $248 million worldwide gross. Today, Web sites, blogs, online reviews and discussions on sites like MySpace.com are essential elements in building "buzz" for a new film.

The Web opens the door to a new model of filmmaking and distribution. The majority of movies are created and distributed by large corporations—such as film studios, television broadcasters or big distribution companies. However, the Web makes it possible to produce a movie for a specialized audience and sell DVDs (yet another digital technology) directly to that audience, bypassing the gatekeepers who would have likely rejected the project for lack of broad appeal. Distribution expert Peter Broderick notes that //Reversal//, a drama about high school wrestling, has never been shown in theaters, on TV or even offered in video stores, but has generated over a million dollars in sales of DVDs and merchandising over the Web. In //The Long Tail: Why the Future of Business is Selling Less of More//, author Chris Anderson describes how the Web enables producers and distributors to target niche audiences with products that don't sell in high enough volume for traditional retail outlets. The ability to make a profit while producing smaller and more unusual types of productions increases as we move away from selling or renting physical objects like DVDs and toward downloading electronic files.

Meanwhile, recent advances in high definition television (HDTV) have brought a quantum leap forward in picture and sound quality. If you've been to an electronics store lately, you know how incredibly clear, vivid and downright huge the new flat-panel screens are. Every frame of digital video is made up of tiny dots of light called pixels; the more pixels, the sharper and better the image, especially when shown on a big screen. Traditional, standard definition video uses about 345,000 pixels for each frame; the best high definition systems use about two million. Once you've seen a beautifully shot, widescreen movie in high definition, you never want to go back to watching old-fashioned standard def again. High definition is transforming Hollywood movies and TV shows (using camera technology pioneered by, once again, George Lucas). Many types of projects that used to be shot on film are now shot in high definition to save time and money; the quality is now high enough that audiences usually can't tell the difference.

Almost every movie today goes through a digital stage at some point in its production. The Digital Cinema Initiative was put forth by a group of studios to bring digital technology all the way to theaters. Currently, when you go to your local multiplex, chances are you're watching a movie being shown with a film projector. New "4K" digital projectors use almost nine million pixels and create a gorgeous picture that never gets scratched or dirty. Theaters have resisted investing in the expensive machines, but because studios can save millions by not having to manufacture and ship heavy film prints, they may eventually subsidize the equipment. However, Hollywood is terrified of the potential for piracy when their new releases come out in digital form. Piracy is already an enormous problem. When the latest James Bond film opened recently in foreign theaters, the pirated DVD was already available on the street.
But just as theaters are poised to move into the digital era, consumers have an exploding number of options for viewing movies: on giant flat-panel screens in their living rooms, on smaller computer screens at their desks, and on tiny iPod or cellphone screens on the street. Digital television—already available with new high definition and standard definition channels—will completely replace traditional analog TV in the United States on February 17, 2009. Between video-on-demand, downloads, TiVo, and Webcasts, we'll soon be able to see almost anything, anywhere, anytime.

Will this mean the death of one of the great worldwide traditions—going to a theater to watch a movie surrounded by an audience that's laughing and crying along with you? Yet again, we look to George Lucas as a bellwether. Because releasing a movie theatrically is incredibly risky and expensive, studios are driven to a blockbuster mentality, creating product for the widest possible appeal (or, depending on how you see it, the lowest common denominator). Even so, most films lose money in the theaters. Lucas, the man behind more blockbusters than almost anyone, told //Daily Variety//, "We don't want to make movies. We're about to get into television." Instead of spending $100 million to make a single film and another $100 million to distribute it to theaters, he said, he could make fifty to sixty films for TV and Internet distribution. As for future audiences going to theaters, Lucas said, "I don't think that's going to be a habit anymore."

When you consider that digital technology is at its heart simply a way to convert movies to a string of ones and zeroes, it's astounding to see how much it has changed the way movies are made, the stories they tell, where they're shown, how much they cost, and who's watching. Stand by for further developments.
 * 1) In Ascher's view, how did digital technology initially change the film industry?
 * 2) In the author's opinion, why was //The Blair Witch Project// a huge success?
 * 3) According to Ascher, what is filmmaker George Lucas's view of the future of movie theaters?
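As a quick check on the pixel figures Ascher cites, the short sketch below multiplies out some common frame sizes. The specific resolutions (720x480 for standard definition, 1920x1080 for high definition, and 4096x2160 for DCI "4K") are assumptions added here for illustration; the article itself does not state them. The bytes-per-frame estimate also makes the "string of ones and zeroes" concrete.

{{{
# Back-of-the-envelope check of the pixel counts cited above (Python).
# Assumed frame sizes (not stated in the article):
#   standard definition (NTSC DV):  720 x 480
#   high definition:                1920 x 1080
#   DCI "4K" projection:            4096 x 2160
resolutions = {
    "standard definition": (720, 480),
    "high definition": (1920, 1080),
    "4K digital projection": (4096, 2160),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    # At roughly 3 bytes per pixel (one byte each for red, green, blue),
    # this is the raw, uncompressed data in a single frame.
    raw_megabytes = pixels * 3 / 1e6
    print(f"{name}: {pixels:,} pixels per frame (~{raw_megabytes:.1f} MB raw)")
}}}

Running it gives roughly 345,600, 2,073,600, and 8,847,360 pixels per frame, which lines up with the article's "about 345,000," "about two million," and "almost nine million."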

----

"Prince William teachers learn digital storytelling." //Manassas Journal Messenger// (Manassas, VA) (March 4, 2009): pNA.

Full Text :
COPYRIGHT 2009 McClatchy-Tribune Information Services. Byline: Bennie Scarton Jr.

Mar. 4--Four educators from Prince William County Public Schools joined peers from across Virginia recently to learn how to teach their students to create digital stories using multimedia tools. Digital storytelling is the art of combining music, video and photos with the author's voice to create a multimedia narrative that engages viewers in dynamic ways.

The interactive professional development event, "A Day of Discovery," was conducted by national experts at the Virginia Beach Conference Center in Virginia Beach and was sponsored by Discovery Education, a division of Discovery Communications, whose networks include Discovery Channel, Animal Planet and Science Channel. Attending the event from the county were Netia Elam, technology resource teacher from Bull Run Middle School; Jill Warner, teacher, Bull Run Middle School; Valaina Maher, instructional technology resource teacher, Bennett Elementary School; and Sarah Stere, teacher, Bennett Elementary School.

Digital storytelling is used by the educators to introduce or reinforce the power of writing, and to encourage students to explore the strength of personal expression and creativity with digital tools. According to Discovery Education's Matt Monjan, a presenter at the Day of Discovery, "Veteran teachers across Virginia are correct in their belief that 21st century students are different. These students have different attention spans, new ways of social networking and different skill sets. For this group, digital storytelling is a very effective teaching tool that engages students in writing and the language arts."

Following the seminar, the educators will continue to be supported in their efforts to improve student achievement with digital media and technology by the Discovery Educator Network, which offers a range of technology training.

Staff writer Bennie Scarton Jr. can be reached at 703-369-6707. To see more of the News & Messenger or to subscribe to the newspaper, go to http://www.insidenova.com/. Copyright (c) 2009, News & Messenger, Manassas, Va. Distributed by McClatchy-Tribune Information Services.

----

Digital education (educational advances because of information technology boom). Chris Dede and Eamon Kelly. //Issues in Science and Technology// 22.1 (Fall 2005): p10(3).

HENRY KELLY'S "GAMES, COOKIES, and the Future of Education" (Issues, Summer 2005) provides an excellent synthesis of challenges and opportunities posed by technology-based advances in personalized entertainment and services. An aspect of this situation deserves further discussion: Children who use new media extensively are coming to school with different and sophisticated learning strengths and styles.

Rapid advances in information technology have reshaped the learning styles of many students. For example, the Web, by its nature, rewards comparing multiple sources of information that are individually incomplete and collectively inconsistent. This induces learning based on seeking, sieving, and synthesizing, rather than on assimilating a single "validated" source of knowledge as from books, television, or a professor lecturing. Also, digital media and interfaces encourage multitasking. Many teenagers now do their homework by simultaneously skimming the textbook, listening to an MP3 music player, receiving and sending email, using a Web browser, and conversing with classmates via instant messaging. Whether multitasking results in a superficial, easily distracted style of gaining information or a sophisticated form of synthesizing new insights depends on the ways in which it is used. Another illustration is "Napsterism": the recombining of others' designs into individual, personally tailored configurations. Increasingly, students want educational products and services tailored to their individual needs rather than one-size-fits-all courses of fixed length, content, and pedagogy. Whether this individualization of educational products is effective or ineffective depends both on the insight with which learners assess their needs and desires and on the degree to which institutions provide quality customized services, rather than Frankenstein-like mixtures of learning modules.

During the next decade, three complementary interfaces to information technology (listed below, after these letters) will shape how people learn. The growing prevalence of interfaces with virtual environments and ubiquitous computing is beginning to foster neomillennial learning styles. These include (1) fluency in multiple media, valuing each for the types of communication, activities, experiences, and expressions it empowers; (2) learning based on collectively seeking, sieving, and synthesizing experiences; (3) active learning based on experience (real and simulated) that includes frequent opportunities for reflection by communities of practice; and (4) expression through nonlinear associational webs of representations rather than linear "stories" (such as authoring a simulation and a Web page to express understanding, rather than a paper). All these shifts in learning styles have a variety of implications for instructional design, using media that engage students' interests and build on strengths from their leisure activities outside of classrooms.

CHRIS DEDE, Wirth Professor of Learning Technologies, Harvard University Graduate School of Education, Cambridge, MA 02138, Chris_Dede@harvard.edu

HENRY KELLY'S ARTICLE PROVIDES readers with a timely and comprehensive look at what is needed to address glaring shortfalls in the U.S. education system. The article underscores the lack of investment in R & D on new educational techniques that would use the up-to-date technology currently available.
By conveying how increased investment in educational R & D can improve teaching and learning, Kelly is making an excellent case for the adoption of the Digital Opportunity Investment Trust (DO IT) legislation. Although the article notes the low rankings of U.S. students as compared to international students in recent studies, not enough emphasis is placed on the fact that our students are performing alarmingly poorly in the fields of math and science. A study conducted in 2004 found that U.S. students ranked 24th in math literacy and 26th in problem-solving among 41 participating nations and concluded that U.S. students "did not measure up to the international average in mathematics literacy and problem-solving skills" (Program for International Student Assessment at www.pisa.oecd.org). Additionally, U.S. students are becoming less interested in math and science. There has been a steady decrease in bachelor degrees earned in mathematics and engineering in U.S. universities during the past decade. While our students are not meeting global standards in mathematics and science and are losing interest in these subjects altogether, the United States has become increasingly reliant on foreign talent in these fields. In 2000, 38% of all U.S. science and engineering occupations at the doctoral level were filled by foreign-born scientists (up from 24% in 1990). Filling these critical occupations with foreign talent has become a more complex issue with the war on terror and as global competition for the best and the brightest in science and engineering increases dramatically. During the 1990s, the Organization for Economic Cooperation and Development saw a 23% increase in researchers, whereas the United States saw only an 11% increase. There is a critical need to change these trends in math and science. We need to build up domestic talent and interest in these crucial areas and provide necessary incentives to attract foreign talent. Increased investment in R & D and educational technology, as outlined in Kelly's article, can begin to address this need. Kelly's article highlights efforts by the National Science Foundation, Department of Education, Department of Defense, and Department of Homeland Security to improve training and educational technologies, but does not stress enough that DO IT is a comprehensive effort that will research and improve teaching and learning techniques that can permeate all U.S. educational institutions. It is important to stress that DO IT legislation would help to fill the current market failure that Kelly mentions ("Conventional markets have failed to stimulate the research and testing needed to exploit the opportunities in education"). DO IT will foster collaboration among educators, cognitive scientists, and computer scientists to research and develop the most effective methods of teaching and learning, using today's technologies. DO IT will help to ensure that the U.S. education system does not continue to fall behind all other sectors and nations that have embraced the potential of technology. We are facing a crisis in public education and math and science education; Kelly presents an excellent case for the need to increase educational R & D and succeeds in demonstrating how currently underutilized technologies can improve the learning process. EAMON KELLY Distinguished Professor in International Development President Emeritus Tulane University New Orleans, Louisiana
 * The familiar "world-to-the-desk-top" interface, providing access to distant experts and archives, enabling collaborations, mentoring relationships, and virtual communities of practice. This interface is evolving through initiatives such as Internet2.
 * "Alice-in-Wonderland" multiuser virtual environment (MUVE) interfaces, in which participants' avatars interact with computer-based agents and digital artifacts in virtual contexts. The initial stages of studies on shared virtual environments are characterized by advances in Internet games and work in virtual reality.
 * Interfaces for "ubiquitous computing," in which mobile wireless devices infuse virtual resources as we move through the real world. The early stages of "augmented reality" interfaces are characterized by research on the role of "smart objects" and "intelligent contexts" in learning and doing.

----

=Classroom Revolution=

Students of almost every age are far ahead of their teachers in computer literacy. This is especially true of younger kids with younger parents. So how is this digital revolution affecting education? A binary answer: Not enough. According to a federal study, most schools are essentially unchanged today despite reforms and increased investment in computers. The general pattern is for computers to be in a computer lab--something separate and apart like a Bunsen burner. Why? Students who have mastered the wonders of the Internet at home know that with a desktop computer they can do everything faster--take and save notes, write and do research. With guidance, kids can learn these skills at home, especially when high-quality interactive programming becomes more widely available in science, history, math, geography, and languages. There is much work to be done in creating these electronic assets, however. And it is critical for teachers to join the revolution--to adapt information technology to the methods and content of their instruction.

Goodbye, Mr. Chips--hello, Mr. Chip! What does this mean? It means a teacher can take the class around the world electronically to look at the development of civilizations in Egypt, Greece, Rome, Latin America. A Spanish class in Idaho can talk to students in Bilbao. It means linking biology students in Chicago with a researcher at a microscope in San Francisco, history students with a curator at the National Portrait Gallery, technology students with the National Air and Space Museum in Washington. Just think, teachers using digitized collections of Civil War photographs and oral histories can immerse students in original building blocks of American history. Students can take virtual trips and collaborate with other students around the world and research in the best libraries in the country. Teachers can compare techniques with colleagues around the country and create teaching modules on everything from calculus to cloning.

Distance learning can explode the number of courses a student might take online with peers, retired experts, and master teachers and writers. Observations can be posted on the Web for use by thousands of other teachers and students. Even the smallest one-room schoolhouse in the wilds can tap into great teaching on an infinite variety of subjects. There is no limit to the possibilities. Distance learning can include Advanced Placement courses and special tutoring for the learning disabled whose talents are not developed in regular classes. With electronic links, textbooks will morph into digital versions with interactive sections, videoconferencing, and dramatic television sequences. What excitement! And all this can be kept as fresh as milk. In the language of Marshall McLuhan, video is a "cool medium"; that is to say, it lends itself to high audience participation. Parents can also benefit by viewing their children's work online, exchanging E-mails with teachers, and watching webcasts from distance schooling. This is the 21st-century version of distance learning. What it offers is much more flexibility in time, place, and pace of instruction, an opportunity to create a superb instructional environment adapted to each school's particular needs.
Reaping the benefits. Of course, teachers and school boards need to be convinced that the Internet can make their schools more effective. Look at West Virginia. In 1990, it launched a statewide effort to use technology to improve its struggling schools. Computers were gradually integrated into classes, beginning with the earliest grades, while the teachers received extensive training over seven years. The result? West Virginia jumped to 11th from 33rd on national achievement tests. To extend state-of-the-art approaches to every school in our new technological universe we also must deal with cost. Even though laptop prices are plunging, schools are going to have to develop innovative budgeting at both state and local levels to acquire the funds for technology, training, and programming. We are on the threshold of the most radical change in American education in over a century as schools leave the industrial age to join the information age. For most of the past century, our schools were designed to prepare children for jobs on factory lines. Kids lived by the bell, moved through schools as if on conveyor belts, and learned to follow instructions. But today many of these factories are overseas, leaving behind a factory-based school system for an information age. Sputnik once woke up America's leaders to how far we had fallen behind the Soviet Union. This generation's Sputnik moment arrived with the economic competition of high skills and low wages from Asia and academic performances far surpassing our own. Here with the Web is the way for America to use the marvels it created to end the regression in our competitive and academic performance. Let's get to it.

Source Citation:
Zuckerman, Mortimer B. "Classroom Revolution." //U.S. News & World Report// 139.13 (Oct 10, 2005): 68. //Opposing Viewpoints Resource Center//. Gale. Springfield Township High School. 10 Mar. 2010.

----

=Online Classes Can Increase Learning=

//The Education Innovator//, "Welcome to the Cyber Classroom," vol. 6, February 29, 2008. //The Education Innovator// is the newsletter of the Office of Innovation and Improvement (OII), the U.S. Department of Education.

//A staggering percentage of American youths use the Internet and its applications on a regular basis to interact with one another. Therefore, appropriately integrating online classes with education can serve the needs of a wide spectrum of students. For example, the scheduling and pacing flexibility of online classes can help at-risk students graduate. Online classes also offer opportunities for students to enroll in advanced courses or specialized electives not taught at their schools due to a lack of qualified teachers, resources, or scheduling conflicts. In today's information age, educators should work to end the "digital disconnect" between schools and the most Internet-savvy generation yet.//

//Kevin left school when his mom went to jail. He worked long hours to support himself, so he couldn't attend regular classes. After he moved in with a cousin who encouraged him to try online courses, he decided to give them a try. Without school or parental support, he struggled to finish the online classes on time and to pass the exams, but he found support from teachers and the flexibility he needed through the virtual classroom, and he eventually earned his high school diploma.//

//Appropriately implemented, online learning ... could play an important role in reducing the current rate of high school dropouts.//

//The fictional story above, which is based on a number of real-life accounts, demonstrates how online courses can meet the needs of many kinds of students, and why these courses are here to stay. Like Kevin, middle school and high school students are dropping out in record numbers. A recent report,// The Silent Epidemic: Perspectives of High School Dropouts//, found that "circumstances in students' lives and an inadequate response to those circumstances from the schools led to dropping out." Most students surveyed for the report said that their classes were uninteresting and lacked opportunities for "real world" learning, so the students lost interest in going to school. Other reasons that students dropped out included the need to make money, to care for a family member, to raise a child, or because academic challenges caused them to fail or fall behind due to a lack of earlier preparation.//

//Appropriately implemented, online learning can enable districts to provide solutions to help address each of these reasons students leave school and, as a consequence, could play an important role in reducing the current rate of high school dropouts. A Project Tomorrow survey of more than 319,000 K-12 students nationwide discovered that 57 percent of high school students indicated interest in or have taken an online course in the past year, and 39 percent liked the self-pacing that online classes could provide. In 2007, the North American Council for Online Learning (NACOL) found that "42 states have significant supplemental online learning programs, or significant full-time programs, or both. Only eight states do not have either of these options, and several of these states have begun planning for online learning development."//

//The Growth of Online Learning//
//Teens are one of America's fastest growing groups of online users and consumers. Just six years ago, surveys showed that merely 60 percent of American school-aged children used the Internet. Yet as of November 2006, a PEW Internet & American Life Project survey showed a dramatic increase, with 93 percent of teenagers online regularly and more than nine in 10 Americans between the ages of 12 and 17 using the Internet. The fact is that more teens than ever before use the Internet as a way to interact with others—and it's not just to send and receive email, but to create and share information and content more often than any other age group in the country.//

//There is still a "digital disconnect" between schools and students.// //While teens are immersed in the online culture, according to a 2007 survey by the Sloan Consortium, only 700,000 public school students, mostly high schoolers, enrolled in online courses in 2005-06. While the total number represents a very small sample of the total high school population, the latest Sloan figures represent a tenfold increase over the number enrolled in online courses over their survey in the year 2000, and that number is growing. A 2002-03 National for Education Statistics (NCES) report on distance learning found that an estimated 8,200 public schools had students enrolled in technology-based distance education courses, which represents 9 percent of all public schools nationwide. That survey revealed that the percentage of schools with students enrolled in distance education courses varied substantially by the instructional level of the school. Overall, 38 percent of public high schools offered distance education courses, compared with 20 percent of combined or ungraded schools, 4 percent of middle or junior high schools, and fewer than 1 percent of elementary schools.// //While some schools do respond to and embrace this new teen culture, there is still a "digital disconnect" between schools and students. In the 2002 PEW Internet & American Life Project study,// The Digital Disconnect: The Widening Gap Between Internet-Savvy Students and Their Schools//, students revealed that the Internet helped them do their homework, and they described many other ways the Internet is used for education -related activities. Indeed, they use the Web as an "online textbook." They sift through reference materials, organize information, and study with friends through instant messaging. Students report, however, that there is a "substantial disconnect between how they use the Internet for school and how they use the Internet during the school day and under teacher direction." And even in the relatively small number of well-connected schools, students report that the quality of web-based assignments can be poor and uninspiring. Since then, there is increased acceptance of online curriculum, but many schools and teachers have not acknowledged that "online" is the way students communicate.// //It is possible, nevertheless, to provide quality online learning opportunities that engage and inspire students. The number of online providers that utilize Internet technology to deliver effective, non-traditional learning approaches to students is growing, and several states are moving ahead with legislation that will offer online curricula as a practical alternative to the traditional classroom.//

//Challenging Students Outside the Classroom Walls//
//"Harnessing the power of innovation for the benefit of American schools is fast becoming an education imperative," said Secretary [Margaret] Spellings in the introduction to the newest OII Innovations in Education Guide,// Connecting Students to Advanced Courses Online//. The Guide, along with a webcast that promoted its availability this December [2007], focuses on case studies from six providers who offer rigorous curricula to students through the Web. The online content includes a variety of Advanced Placement (AP) courses, International Baccalaureate (IB) classes, and other dual enrollment options that enable students to earn college credit while still in high school.// //The Guide gives examples of promising practices in key areas including ensuring course quality; recruiting, counseling, and supporting students; and tracking outcomes for continuous improvement. According to the introduction, the Guide's "aim is to familiarize districts and schools with the issues they must consider and address if students are to achieve success in this new form of learning." But students are ready to welcome the virtual classroom.// //Jesse, a very bright student, who found many classes uninteresting, was energized by the idea of taking more advanced classes than offered at her high school, with the idea that she could graduate early and attend college. She is taking online AP Macroeconomics in the tenth grade. She chose online classes so she can challenge herself in ways she never thought possible.// //Motivated students such as Jesse, who are looking to expand their educational options, are just as likely to find online courses beneficial as students with academic challenges. Yet, according to the National Center for Education Statistics, advanced courses in English, mathematics, science, and foreign language are unavailable to as many as a quarter of high school students. Educators say there are various reasons schools cannot offer advanced classes—lack of qualified teachers, low student interest, and students' scheduling conflicts are the most common. Online courses are one way to help overcome these barriers and bridge the gap.//

//Online education can help students succeed.//

//Florida Virtual School—An Example of a Program that Serves a Range of Students//
//One of the providers featured in the guide, Florida Virtual School (FLVS), serves students who are prepared for and interested in enrolling in AP courses and those who might benefit from a virtual classroom because of their special circumstances. Florida law requires that priority for age and grade appropriate classes be given to students from schools that are rural, low-performing, high-minority, and home- or hospital-bound students. Thanks to state funding, FLVS classes are free for all Florida students and open to non-Florida residents who pay a nominal fee.// //FLVS offers a range of online courses and tools, including the following: 11 AP courses; core academic courses such as English and mathematics; online preparation classes for Florida's statewide assessments; SAT preparation courses; and AP exam reviews. Since its inception in 1997, when it offered just five classes serving 77 students, FLVS has continued to grow. It now offers more than 85 classes serving over 31,000 students.// //Most recently, they have added a middle school program that will have regular and accelerated classes. According to one educator, "Adding middle school online courses partners very nicely with the opportunities to offer middle school students pre-AP level courses. In our experience with the online Advanced Placement, you really need to work with students at the middle school level with some kind of course that's going to excite them about advanced coursework and engage them in curriculum that they are going to need to succeed in AP at the high school level."// //Students and parents are enthusiastic about their experiences with FLVS. One parent said that FLVS "was the best thing to ever happen to my son's life.... It saved at least one kid from being lost in the system. He probably would have dropped out." Another parent said that FLVS has filled a need for them because, due to her husband's job, they live in a rural area that doesn't have the advanced courses her child needed, and the online school "allows us to stay together as a family a lot more than if she were enrolled in the local high school full-time." Another student said, "I am currently taking Marine Science Honors through the Florida Virtual School, as well as going to [my local] high school. The Virtual School is a great institution. It has allowed me to take extra courses, and now I can graduate a year early. This program has been a tremendous help to me."// //Teachers at FLVS are dedicated and engaged, collaborating from different content areas in teams known as "schoolhouses." They share perspectives about teaching so that others can gather new ideas and creatively improve their own methods. FLVS employs 425 full-time and 200 part-time state certified instructors, boasting a 95 percent teacher retention rate.// //Online education can help students succeed, giving them opportunities to take advanced courses, to take more interesting courses than those offered at their local school, or simply to provide the challenge and incentive to stay in school. They serve an important purpose in today's information age, and there is no doubt that delivering coursework over the Internet is a development whose time has come.//

----

=communication revolution of the 20th century=

Other Article
During the 20th century, Americans experienced a revolutionary change in their choices of ways to communicate. To the many inventions made in the 19th century to improve communication, the 20th century added motion pictures, radio, television, and the Internet. Most of the products created for improving communication in the 19th century improved communication between individuals. During the 20th century, new products added ways for groups or organizations to communicate to other groups, and mass communication was born.

The first major communication improvement to be commercialized in the 20th century was the motion picture. To create a motion picture, a long roll of unexposed film is loaded into a movie camera and the film is advanced, then stopped and exposed, then advanced again at high speed, creating a film with a long sequence of still images on it. The movie is shown by running the developed film through a projector using an advance, stop and show, advance again sequence that matches the camera's sequence, with rapid sequencing providing the appearance of motion. With sound added to the film, a separate but synchronized soundtrack on the film is read by an optical sensor and sent to the sound system.
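To get a sense of the "long sequence of still images" described above, here is a rough calculation. It assumes the 24 frames-per-second rate standardized for sound film and 16 frames per foot of 35mm film; neither figure appears in the article, so treat this as an illustrative sketch only.

{{{
# Rough frame-count arithmetic for the projection process described above (Python).
# Assumptions (standard figures, not stated in the article):
FRAMES_PER_SECOND = 24        # sound-film projection rate
FRAMES_PER_FOOT_35MM = 16     # 4-perforation 35mm film

def film_stats(running_time_minutes):
    """Return (total still images, feet of 35mm film) for a running time."""
    total_frames = running_time_minutes * 60 * FRAMES_PER_SECOND
    feet_of_film = total_frames / FRAMES_PER_FOOT_35MM
    return total_frames, feet_of_film

frames, feet = film_stats(100)  # a typical 100-minute feature
print(f"A 100-minute feature: {frames:,} still images on about {feet:,.0f} feet of film")
}}}

A 100-minute feature works out to 144,000 separate still images on roughly 9,000 feet of film, each advanced, stopped, and shown in turn.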

The first practical motion picture camera in America was invented by [|Thomas Edison]. In 1896, Edison showed a motion picture to the public in the New York City Music Hall by projecting film on a screen in the front of the auditorium, creating a lot of excitement. The movie was so successful that new films quickly followed, thus giving rise to the movie industry. Although motion pictures were mostly used for entertainment, showings often included such current news events as wars, parades, and speeches. In the 1930s, after sound had been added, newsreels covering the week's major events were shown in most theaters along with the movies and proved very popular.

Early in the 20th century, technology brought wireless communication to America. With wireless, a sending station could send a message to a receiving station using no visible means of connection, even across mountains and over water. Many people thought it was done by magic. Actually, the signal was carried by electromagnetic radiation, just as light is, only at a lower frequency than people are able to see. Depending on the transmitting frequency, messages could be broadcast anywhere in the world and even to outer space. Examples of wireless devices available today are radios, televisions, satellite dishes, and hand-held wireless devices like cell phones and wireless computers.
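The paragraph above says radio signals are electromagnetic radiation "just as light is, only at a lower frequency." A small worked example makes that gap concrete: every electromagnetic wave travels at the speed of light, so wavelength = speed of light / frequency. The example frequencies below are typical values chosen for illustration, not figures from the article.

{{{
# Wavelength = c / frequency for a few illustrative frequencies (Python).
SPEED_OF_LIGHT = 3.0e8  # meters per second (approximate)

examples = {
    "AM radio (~1 MHz)": 1.0e6,
    "FM radio (~100 MHz)": 1.0e8,
    "visible green light (~5.5e14 Hz)": 5.5e14,
}

for name, frequency_hz in examples.items():
    wavelength_m = SPEED_OF_LIGHT / frequency_hz
    print(f"{name}: wavelength of about {wavelength_m:.2e} meters")
}}}

AM radio comes out near 300 meters, FM near 3 meters, and visible light around half a micrometer, which is the frequency gap the article is pointing to.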

In 1895, Italian inventor Guglielmo Marconi transmitted the first wireless [|telegraph] message. In 1901, he used radio telegrams to communicate with ships traveling across the Atlantic Ocean between England and Newfoundland, Canada. Using radio telegrams, ships informed other ships of bad weather or ice conditions near them, and ships informed the shore of such disasters as shipwrecks. The //Titanic// sinking in 1912 was a famous example of both the use of the wireless telegraph—by broadcasting the sinking—and of the hazard of ignoring it, as the ship's wireless operators had not read telegrams that warned them of icebergs ahead.

In 1906, the first voice messages were sent from America to ships in the Atlantic Ocean. Many historians credit Reginald Aubrey Fessenden, a physicist, for sending the first voice message. From this beginning, voice radio rapidly grew, creating such new applications as amateur (Ham) radio; broadcasting; and eventually portable radios for police, military commanders, railroads, and airlines. To support the applications, new radios were developed that were more rugged, smaller, and less expensive, and new transmitters became available that were more powerful and could reach more people.

Experimental broadcasting to a mass audience started in 1910 with a program by the famous singer Enrico Caruso at the Metropolitan Opera House in New York City. By 1920, several experimental broadcasting stations had converted to commercial stations by broadcasting programs on a regular basis, including such news as the results of the 1920 presidential election. Because radio was a good way to communicate with large groups of people, broadcasting rapidly consolidated into national networks in order to attract [|advertising] revenue to support news and entertainment programming. The [|Radio Corporation of America] (RCA) created the first nationwide broadcast network, the [|National Broadcasting Company] (NBC), in 1926.

Most radio broadcasting today is by local stations supporting specialized markets but still dependent on advertising revenue. Radios used by business also navigate ships and airplanes, coordinate emergency services, schedule and track train and truck movements, and operate remote equipment from central locations, like that used in [|space exploration]. Ham radio is still an active hobby for many people.

The simultaneous introduction of motion pictures and radio to American audiences in the early 20th century whetted appetites for combining the two media, partly because motion pictures at that time did not have sound. Clearly, a tremendous opportunity existed for broadcasting pictures with synchronized sound, and there were no scientific reasons why wireless could not send the images. Almost from the start of commercial broadcasting, inventors were working on ways to transmit images.

In the late 1920s, many attempts were made to create an experimental telecast, and a few met with success, particularly RCA's efforts. In 1936, NBC provided 150 experimental television sets to homes in New York City and sent telecasts to them, the first show being the cartoon "Felix the Cat." By 1939, NBC was providing regular telecasts but to a limited market. When the United States entered [|World War II] in 1941, however, all television projects were suspended until the war ended in 1945.

After the war, television development continued where it left off, with the invention of better television sets, creative programming, and larger markets. The first coast-to-coast program was President [|Harry Truman's] opening speech at the Japanese Peace Treaty Conference in 1951. By the 1950s, television had become a profitable industry. In 1953, the first color telecast was made, which spread so fast that by the 1960s, most telecasts were in color.

Television continued to grow, connecting Europe to America by satellite in 1965, adding publicly funded networks like the Public Broadcasting System in 1967, televising from the moon in 1969, and delivering programs by satellite and cable providers by the 1970s. The introduction of portable video cameras brought television production into the home and office. In addition, the combination of cameras and video tape players also provided a way for organizations to produce educational documentary programs to support special needs, which mostly eliminated home and educational motion pictures.

In the 1960s, the [|Department of Defense] created an open network to help academic, contract, and government employees communicate unclassified information related to defense work—the Internet. During the 1980s, the defense functions were removed from the network, and the National Science Foundation operated the remainder, adding many new features to the network and expanding its use around the world. Today, the Internet is a critical component of the [|computer revolution], offering electronic mail (e-mail), chat rooms, access to the wealth of information on the [|World Wide Web], and many Internet-supported applications.

The Internet has had a dramatic impact on American life. E-mail is rapidly replacing long-distance telephone calls, and chat rooms have created social groups dedicated to specific subjects, but with members living around the world. The Internet has not only changed how people communicate but also how they work, purchase, and play. Many people now work at home, using the Internet to stay in touch with the office. People have also begun to use the Internet for banking and shopping services rather than so-called brick-and-mortar locations.

The communication revolution of the 20th century did create a new array of social problems that will have to be addressed in the 21st century. While people have access to more information than ever before, the continual bombardment of that information, much of it unfiltered and unvalidated, has created several generations of children who are seemingly immune to extreme violence. Health concerns are also an issue, since Americans spend less time in outdoor activities and more time sitting in front of the television or computer. The online nature of the Internet will also make individual and corporate [|privacy] one of the major issues of the near future.


----

=American film=

Other Article
Though film originated in Paris in 1895, the United States has dominated the motion picture industry since [|World War I]. Film has profoundly influenced the history of the nation over the past century and indelibly shaped the world's image of the United States.

Several inventors working independently in the United States and Europe developed film technology in the late 19th century. Much of the work for the development of the motion picture projector was undertaken at [|Thomas Edison's] laboratory in Menlo Park, [|New Jersey]. By 1895, entrepreneurs in Europe and the United States were charging audiences for displays of the new medium.

The first films were short, rarely lasting more than a few minutes, and functioned as visual effects similar to magic tricks. Such movies as //The Great Train Robbery// (1903), however, signaled film's future as a powerful medium of storytelling. By World War I, novelty shorts had given way to long films with plots and character development. The unprecedented commercial success of D. W. Griffith's three-hour drama //[|Birth of a Nation]// (1915) established the feature film as the standard for the medium and the United States as the international center of the industry.

After World War I, the motion picture industry underwent several significant changes, none more important than a transformation in the mode of film production. Early filmmakers like Georges Méliès and Edwin S. Porter had controlled all aspects of their work, from such technical tasks as script writing and editing to distribution and promotion. By 1920, such men had been replaced by teams of specialists. Another significant change was the relocation of the industry from the east coast of the United States to Hollywood, a suburb of Los Angeles in southern [|California]. A third important trend was the emergence of several film studios that came to dominate production and distribution during the interwar era.

The meteoric rise in popularity of cinema in the years immediately preceding World War I had encouraged a proliferation of independent film companies throughout the United States and Europe. However, as the industry gravitated toward California, a handful of powerful production companies managed to appropriate control over all aspects of the industry, from production to distribution to exhibition. Those studios developed a vertically integrated system in which a handful of businessmen dominated all aspects of the film industry, from movie production to theater management. This "studio system" would dominate filmmaking until the 1950s.

In the post-World War I era, movies became a popular leisure activity for Americans of all social classes, and the growing sophistication of filmmaking ushered in a golden age of silent films during the 1920s. Cinema actors became internationally known celebrities, and American films came to be shown all over the world. As Hollywood emerged as the center of the global film industry, it drew artists from Broadway and Europe. Among the notable foreigners to succeed in Hollywood were the film stars [|Douglas Fairbanks] and [|Charlie Chaplin] and such directors as [|Erich von Stroheim] and [|Alfred Hitchcock].

The end of the silent era, which was signaled by the 1927 musical //The Jazz Singer,// welcomed a new generation of stars and a variety of new film genres (like musicals) that took advantage of sound technology. The advent of "talkies" undermined the popularity of foreign films in the American market, which increased Hollywood's advantage over German and French studios and made "Hollywood" synonymous with "movies" for much of the Western world.

The 1920s also saw the advent of stringent [|censorship] of the film industry. In 1922, the studio heads hired the politician [|Will Hays] to establish a censorship board to supervise standards of filmmaking. The advent of sound film, and the growing popularity of more risqué stars like [|Mae West], encouraged the creation in the 1930s of a more exacting production code, which discouraged references to or images of sex and violence.

By 1930, the structures and formulae of modern commercial motion pictures had been established. The feature-length talkie, with its linear narrative, was the standard for commercial films. Most films also fit into one of several genres, including romantic comedies, westerns, gangster films, and musicals. Five companies controlled the production and distribution of virtually all films, and a "star" system developed in which those studios signed popular actors to long-term contracts and carefully managed their careers. In the 1930s, the most commercially successful movies relied on the popularity of a handful of stars like [|Cary Grant], [|Clark Gable], Claudette Colbert, [|Judy Garland], and [|Jimmy Stewart].

[|World War II] brought further change to the industry. During the war, the government enlisted Hollywood to produce propaganda films for the war effort, and most feature films embraced patriotic themes that appealed to war-time audiences. At war's end, however, a darker, more pessimistic genre, dubbed //film noir,// emerged that reflected the disillusionment and uncertainty felt by many in the new nuclear age.

In 1946, more Americans went to the motion pictures than ever before. In the next decade, however, audiences began to decline, largely due to the movement of many Americans to the suburbs and the advent of television. The industry faced a further challenge after the federal government declared the studio system an illegal monopoly in 1948 and forced it to divest itself of ownership of movie theater chains. That break-up of the industry's vertical integration eventually ended the era of the studio system.

The industry responded to the challenge of television by experimenting with new techniques for exhibiting films, including 3-D and Cinemascope, which offered audiences experiences that could not be reproduced on the small screen. Hollywood also returned to making such spectacle films as //Ben-Hur// (1959), starring [|Charlton Heston], and //Cleopatra// (1963), starring [|Elizabeth Taylor], that took advantage of the big screen.

Since teenagers and young adults were one segment of the population that had not abandoned the movies, more and more films were targeted toward their tastes. Thus, in the 1950s, such young actors as [|James Dean] and the singer [|Elvis Presley] ranked among the biggest box-office attractions. Children and teens were also the main audience of the low-budget science fiction films that proliferated during the decade. The dwindling audiences also pressured the studios to allow directors to challenge the production codes, which were gradually relaxed during the late 1950s and ultimately abandoned by the end of the 1960s.

By the early 1960s, American movies had become dominated by actors and writers who had learned their craft in television, and many critics lamented the demise of innovation and creativity in the industry. Then, in the late 1960s and early 1970s, a new generation of filmmakers began making movies that reinvigorated and revised many old genres. Such films as //Butch Cassidy and the Sundance Kid// (1969), //The Godfather// (1972), and //The Exorcist// (1973) breathed new life into classic genres and brought audiences back into the theaters. Though less critically acclaimed, the box office smashes //Jaws// (1975) and //Star Wars// (1977) launched the careers of two directors, [|Steven Spielberg] and [|George Lucas], who would become the most commercially successful filmmakers of all time. Their films also set a precedent for sophisticated special effects and production costs that made filmmaking more expensive than ever before.

The 1980s saw the advent of video cassette players, which threatened to send audiences back home to their televisions. Video actually increased the profitability of the studios, however, by attracting new audiences and encouraging many people to see films multiple times. By the end of the 1980s, the industry was making more money from video than from theater screenings. The 1990s saw a continuation of the trends of the 1980s, with audiences growing and motion pictures becoming more expensive. Perhaps the film that typified the decade was James Cameron's //Titanic// (1997), which set records for both production costs and box office profits.

As films became more expensive, studios were increasingly wary of taking risks, as a film like //Titanic// could have bankrupted its production company. In the 1930s, studio heads had prided themselves on producing a certain number of prestige films that might not turn a profit but that gave the studio an artistic cachet. By the 1990s, such films had become an unaffordable luxury for most studios. However, the 1990s did see the growth of a vibrant independent film industry, which relied on lean production methods to make films that usually enjoyed a limited release.

In 1995, film celebrated its centenary. In those 100 years, American cinema has become one of the most potent cultural forces on the planet. Hollywood has introduced the world to an idealized lifestyle and set of values that have irrevocably shaped the world's view of the United States. It has also encouraged the development of a national culture by creating stars whose popularity is shared throughout all regions of the country. Film stars like [|John Wayne] remain heroes to a cross-section of Americans and continue to represent a vision of America to the rest of the world.

ID: 379980

"American film." //American History//. ABC-CLIO, 2010. Web. 24 Feb. 2010. .
----

Article 3

=Warner Brothers=

Group / Organization
A global leader in the production and distribution of entertainment, the Warner Brothers studio was founded in 1923 by Harry, Albert, Sam, and Jack Warner, four Jewish brothers from Krasnashiltz, Poland. The brothers purchased and moved onto the First National Pictures lot in Burbank, California, where the Warner Brothers studio has remained ever since.

Prior to 1903, Harry Warner saw his first motion picture during a trip to Pittsburgh. Enthralled, he convinced Albert and Sam to join him in traveling from town to town and showing movies in theaters. Jack, who had been a minstrel singer, soon joined them, as did their sister, Rosie. Jack sang before each movie, and whenever the film broke, Rosie played the piano. Soon, the Warners saved enough money to open their own theater in Newcastle, Pennsylvania.

The Warners and many other theater operators encountered a problem—they could never rely on movies being shipped to them on time or being shipped at all. Consequently, Harry got the idea to form an alliance with other theater operators to pressure the studios to deliver the movies on schedule. From that idea evolved the nation's first movie distributorship, the Duquesne Film Exchange. The outfit suffered, though, when producers opposed it as a drain on their profits, and in 1912, the Warners sold out.

Their setback convinced them to make their own movies. They began shooting slapstick comedies in New York, calling them "Warner Features." In 1917, they paid James W. Gerard for permission to make his popular book //My Four Years in Germany// into a movie. Released the following year, it won critical acclaim and attracted large audiences. Within a few months, the Warners built a studio on Sunset Boulevard in Hollywood, California, and several years later, in 1923, incorporated as Warner Brothers Pictures. A year later, the company created and filmed the first canine star, Rin Tin Tin. In 1925, they purchased a distribution company called Vitagraph Studios, a move that gave them an advantage over their competitors. Harry and Sam then worked with Bell Laboratories on a cumbersome system called Vitaphone that allowed them to add sound to movies.

In 1926, the studio produced //Don Juan,// the first movie with sound effects, and in 1927, it produced //The Jazz Singer,// the first movie to have partial sound dialogue. The film's release was triumphant yet tragic, as Sam died the day before the premiere, and none of his brothers were able to attend the opening. The first full "talking picture," //Lights of New York,// was produced by the Warner Brothers studio in 1928, revolutionizing the movie industry and ending the era of silent films forever.

With its reputation for innovation, the studio attracted such big stars as [|Humphrey Bogart], [|James Cagney], [|Bette Davis], Errol Flynn, Leslie Howard, and [|Edward G. Robinson]. In the 1930s, the Warners acquired the Stanley Company of America, which owned hundreds of movie theaters. Following Harry's advice, the studio soon bought radio companies and music publishers and paid large sums to attract actors from its competitors. Throughout the decade, Warner Brothers made spectacles and movies dealing with social issues. The company also ushered in the era of the tough guy genre with such films as //Little Caesar// (1930), //The Public Enemy// (1931), and //Scarface// (1932). From the 1930s through the 1950s, Warner Brothers produced many notable films, including //42nd Street// (1933), //Captain Blood// (1935), //The Maltese Falcon// (1941), //Now Voyager// (1942), //Casablanca// (1942), //A Streetcar Named Desire// (1951), //Mister Roberts// (1955), //Giant// (1956), and //Auntie Mame// (1958).

In the 1960s, television greatly damaged Warner Brothers, and by 1969, it had stopped making movies and based its survival on profits from its two record companies, Warner/Reprise and Atlantic. Kinney Services then bought control of Warner (renaming it Warner Communications) and in the 1970s started making movies again, among them the hits //Woodstock, The Exorcist,// and //All the President's Men.// Meanwhile, Warner Brothers also became an innovator in television with the introduction of the television miniseries. In 1990, Warner merged with Time, Incorporated, creating Time Warner, an entertainment giant. Five years later, the company launched a new television network dubbed //The WB.// Today, in addition to its other enterprises, including cable television, Time Warner dominates the music industry with a 21% share of the market. In 2001, Time Warner merged with [|America Online] (AOL). Currently, Time Warner oversees AOL, Home Box Office, New Line Cinema, Time Warner Cable, [|Turner Broadcasting System], the WB, and Warner Brothers Entertainment. Time Warner currently employs 80,000 people.

Of the founders of Warner Brothers who survived [|World War II], Albert served as the studio's treasurer until his death on November 26, 1967. Harry served as president until his death on July 25, 1958; he was survived by his wife, Rea Levinson, with whom he had four children. Jack held the vice presidency and, after Harry's death, the presidency, supervising the selection of scripts and the hiring of actors. He married several times, and his son, Jack Jr., worked for the studio. Jack Sr. died on September 2, 1978.

ID: 391917

"Warner Brothers." //American History//. ABC-CLIO, 2010. Web. 24 Feb. 2010. .
----

=computer revolution=

Other Article
The computer revolution, sometimes called the information revolution or the Third Industrial Revolution, fulfilled humankind's age-old search for a machine that could perform mathematical and logical computations to help people solve complex problems. That search culminated in the development of the computer, a machine that is changing how almost everyone lives and works.

The computer is actually a family of products that execute arithmetic and logical operations that help people solve problems, perform tasks, control processes, communicate with others, and even play games. Most computers are made of an integrated set of devices that process instructions (through logic circuits and memory), store data (on hard drives and removable drives), interact with the user (by employing keyboard, mouse, and display terminals), communicate with other computers (through telecommunication), and execute custom functions (with the use of application software).

One critical feature that differentiates a computer from such numerical devices as adding machines or calculators is the computer's ability to execute problems by employing logic. In other words, when presented with two options, a computer can be instructed to execute Task A if a certain set of circumstances is true or execute Task B if another set of circumstances is true. In practice, logic operations are much more complex. The instructions that guide computers in their operations are known as programs. Computer programmers spend much of their time, when writing programs, reducing complex problems in logic to chains of these basic if-then statements.
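
A minimal sketch (in Python, not from the article) of the branching idea described above: the program tests one set of circumstances and executes Task A if it is true, or Task B otherwise. The account/purchase scenario is invented purely for illustration.

```python
def run_task_a():
    print("Task A: the first set of circumstances was true")

def run_task_b():
    print("Task B: the second set of circumstances was true")

def decide(account_balance, purchase_amount):
    # The program tests a condition and chooses which task to execute.
    if account_balance >= purchase_amount:   # circumstances for Task A
        run_task_a()
    else:                                    # otherwise, Task B
        run_task_b()

decide(account_balance=100, purchase_amount=40)  # prints the Task A message
decide(account_balance=10, purchase_amount=40)   # prints the Task B message
```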

Although a useful, general-purpose computer did not appear until the late 1940s, the technologies required to build one evolved over a long period of time. For example, calculation devices, including the abacus, the slide rule, and the mechanical calculator, existed well before the [|American Revolution]. Many discoveries helped make the computer what it is today, but the most crucial developments were punched cards, Boolean logic, tabulating machines, electronic computers, transistors, peripheral devices, telecommunications, personal computers, and the Internet.

In 1801, French weaver Joseph Jacquard invented a method to use boards with holes punched in them in a chosen pattern to guide a needle in a weaving loom. If the needle encountered the board when it moved, it did not push a thread into the loom; if it encountered a hole, it pushed the thread through at that spot in the pattern. That approach led to the punched card, which was used later to enter information into a computer.
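
A tiny illustrative sketch (the punched pattern is invented) of the Jacquard idea: each position in a row either has a hole, so the needle passes and the thread is pushed through, or has no hole, the same true/false pattern later reused for punched-card input.

```python
# One row of a punched pattern: True = hole (needle passes, thread pushed
# through at that spot), False = no hole (needle is blocked).
row = [True, False, False, True, True, False, True, False]

woven = "".join("X" if hole else "." for hole in row)
print(woven)  # X..XX.X.  -- the pattern woven for this row
```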

Building on that idea, British mathematician Charles Babbage conceived in the 1830s a design for a steam-driven computer that contained logic, memory, storage, punched-card input, and a printer. He called his invention the Analytical Engine, and in principle it could solve complex problems. Although he created a design for the machine, he did not have the money to build it. His concept, however, defined the basic architecture of future computers.

In 1854, a British mathematician named George Boole defined the concepts behind binary algebra, marking one of the most significant mathematical breakthroughs for the later development of computer technology. Also known as Boolean algebra, binary algebra represents information by strings of digits, with each digit being a zero or a one (values that can also be read as false or true). The string as a whole represents a piece of data. Computers use the same concept today but in more complex ways.
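
A brief sketch of how a string of zeros and ones can stand for a piece of data; the character code used here (ASCII) is a modern convention chosen only for illustration.

```python
text = "IBM"

# Each character becomes an eight-digit string of zeros and ones (its ASCII code).
bits = " ".join(format(ord(ch), "08b") for ch in text)
print(bits)  # 01001001 01000010 01001101

# The same strings of digits can be translated back into the original data.
decoded = "".join(chr(int(chunk, 2)) for chunk in bits.split())
print(decoded)  # IBM
```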

Boole also defined how a small set of logical operators, typically "or," "and," "not," and "not and," when coupled with such branching logic as "if," "then," and "else," could mathematically solve logic problems based only on digits containing zero or one. Boole's approach, called Boolean logic, is the fundamental basis for designing logic circuits and serves as a basic building block for most programming languages.
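
A minimal sketch of Boole's operators in Python; the `nand` helper is defined by hand since it is simply "not and," and the if/else keywords play the role of the branching logic described above. The door/key scenario is invented for illustration.

```python
def nand(a, b):
    # "not and": false only when both inputs are true
    return not (a and b)

door_locked = True
has_key = False

# Combine the basic operators to answer a small logic question.
can_enter = (not door_locked) or (door_locked and has_key)

if can_enter:
    print("then: enter the room")
else:
    print("else: stay outside")

print(nand(door_locked, has_key))  # True, because the two inputs are not both true
```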

In 1888, Herman Hollerith, an American engineer working under contract to the U.S. Census Bureau, created the first practical mechanical computer to speed up the processing of data collected in the 1890 census. He used punched cards to enter the data (reading the holes with electromagnets), tabulated the data, and then printed the results with a simple printer, all of his own design. His computers completed the census processing in just six weeks, which was much faster than the 10 years required to manually tabulate the 1880 census.

In 1896, Hollerith started the Tabulating Machine Company to build tabulating machines similar to his census machines. In 1911, he sold the company, which was then merged into the Computing-Tabulating-Recording Company (C-T-R). C-T-R's main products were time clocks and computing scales. The owners appointed Thomas Watson as general manager, and by 1924, Watson had gained control of the company, renaming it [|International Business Machines] (IBM).

Over the next 25 years, IBM produced business machines for accounting and other purposes, eventually dropping the scales and time clocks from its production lines. The company grew rapidly, particularly during [|World War II] when accounting machines were needed for the war effort. After the war, IBM's success with business machines placed it in a perfect position to capitalize on recent developments in the fledgling computer industry.

In 1937, British mathematician Alan Turing formulated the concept now known as the Turing machine, which defined how mathematical algorithms could be restructured into simple steps that a machine could execute. He also described symbolic coding, which allowed programmers to use letters and numbers to stand for the binary zeros and ones in their programs, providing the foundation for computer software.
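
An illustrative toy "assembler" (the mnemonics and operation codes below are invented, not Turing's notation) showing the idea of symbolic coding: letters stand in for binary instruction codes, and a small translator turns them into strings of zeros and ones.

```python
# Invented mnemonics mapped to made-up four-bit operation codes.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011", "HALT": "1111"}

program = ["LOAD 2", "ADD 3", "STORE 7", "HALT 0"]

machine_code = []
for line in program:
    mnemonic, operand = line.split()
    # Each symbolic instruction becomes opcode bits plus a four-bit operand.
    machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))

print(machine_code)  # ['00010010', '00100011', '00110111', '11110000']
```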

Meanwhile, [|Iowa] State University physicist John Atanasoff had built the first digital electronic computer in 1939 to solve sets of linear equations, using Boolean algebra and two modified IBM accounting machines. In 1944, [|Harvard University] engineering professor Howard Aiken had built a more powerful machine, called the Mark I, using a similar approach. Both computers used electromechanical relays to process binary numbers but neither was meant for general-purpose applications.

In 1946, [|John Presper Eckert Jr.] and [|John Mauchly], both at the University of [|Pennsylvania], built the first general-purpose digital electronic computer, called the Electronic Numerical Integrator and Computer (ENIAC), which used vacuum tubes instead of electromechanical relays to process data. ENIAC was a thousand times faster than the Mark I. As with other early computers, the ENIAC was programmed by rewiring the various interconnections to execute instructions.

In 1951, the Remington Rand Company (later Sperry Rand) shipped the first Universal Automatic Computer, or Univac, using computer designs created by Eckert and Mauchly—and the computer race was on. IBM shipped its first general purpose computer, the IBM 701, in 1952, followed by such companies as the [|General Electric Company], Honeywell, Control Data Corporation, Burroughs, [|RCA], and others. Over the following years, computer companies merged and restructured many times to keep up with the rapidly changing computer market.

At the same time, developments in peripheral devices were improving the usability of computers as well. In the late 1950s, tape drives were added to permanently store data on a medium that was much easier to handle than punched cards, which had to be kept in precise order to function properly. Tape drives stored data as files of information that could be transmitted much faster than from cards.

In the early 1960s, IBM invented a rotating disk drive for permanently storing data, called Random Access Method of Accounting and Control (RAMAC), from which the computer could read or write and thereby obtain programs and data directly from the disk. The disk drive, now called a hard drive, greatly increased the usability of the computer and made many online applications possible. Over the years, the disk drive has gotten physically smaller while its data capacity has expanded.

Another device introduced in the 1960s was the monitor, a cathode ray tube similar to a television screen. The monitor lets users interact directly with computers, running online programs to enter data and gather information. By the 1970s, the monitor had eliminated punched cards.

Also during the 1960s, computers were connected to each other through networks to transfer information over telephone wires. Modulator/demodulator devices, called modems, provided the functions that made the connection and then transmitted and received the digital data over telephone networks. When monitors were connected to computers through the telecommunications network, the user and the processor could be located in separate places, miles apart.

Computer software programming has evolved along with hardware. Operating systems manage the interfaces between hardware and applications, while application support software manages interfaces between the application and telecommunications or database systems. Such computer languages as FORmula TRANSlation (FORTRAN), Common Business Oriented Language (COBOL), and PL/1 make programming easier. Computer-based applications help users with accounting, time reporting, claims adjusting, word processing, online programming, and many other application needs. Software made the computer a general-purpose machine, offering support to a whole range of industries, including science, banking, marketing, publishing, and manufacturing.

Meanwhile, the research that would shrink computer hardware had begun two decades earlier. In 1947, [|John Bardeen], [|Walter Houser Brattain], and [|William B. Shockley] of the Bell Laboratories research organization of the American Telephone & Telegraph Company developed the first transistor, a solid-state device that could switch electric current on and off and amplify voltage, performing the functions of vacuum tubes, resistors, and capacitors. Transistor circuits could later be replicated many times on a single microchip, making devices that performed the functions of a processor chip (to execute logic) or a memory chip (to store the programs and data while the program executed). By chemically etching circuits onto the surface of a silicon wafer, millions of circuits could be placed on a one-square-inch chip. From those chips, electronic devices were miniaturized to the point that products are now available for uses that were not even dreamed of in the 1950s.

Continuing that pathbreaking research, during the 1960s microchips replaced the vacuum tubes used for logic processing and the magnetic cores threaded on woven wires used for memory. With vacuum tubes and core memories, computers were huge, consumed a lot of electricity, failed often, and were very expensive to build. Microchips were much smaller, sturdier, and considerably less expensive to produce.

By the 1970s, computers had evolved into large and powerful machines, called mainframes, that could process huge amounts of information but required large support staffs for systems management, programming, problem resolution, and help desk support. Mostly, they were general-purpose computers, but some special-purpose ones had evolved, most notably [|supercomputers] (to run such data-intensive applications as weather forecasting), process control computers (to control industrial machines), and minicomputers (to run small business applications).

During the 1980s, workstation computers became available to run such specialized business applications as scientific modeling, engineering design, automated teller machines (ATMs), and store checkout machines. These machines are true computers, running programs on a client's workstation but storing data and communicating with others through mainframes acting as servers. In a client-server system, the servers provide connection, security control, data management, and system administration for the client workstation.
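
A minimal client-server sketch using Python's standard socket library, illustrating the division of labor described above; the port number and request text are arbitrary, and a real departmental system would add the security control and data management the article mentions on the server side.

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007  # arbitrary local address for this sketch

def server():
    # The "mainframe acting as a server": accepts one connection and answers a request.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(("server processed: " + request).encode())

def client():
    # The "client workstation": runs the application and asks the server for data.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"look up customer 42")
        print(cli.recv(1024).decode())

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)  # give the server a moment to start listening
client()
```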

Within many companies, departments and projects needed workstation-based systems to provide local support for applications that were unique to them but did not justify a separate mainframe. These systems, called departmental systems, managed data and interfaced with other networks on a company mainframe server while running applications on a client workstation. That technological [|integration] gave them the best of both worlds. Client-server networks became very popular in business.

Starting in the 1970s, computers were developed for individuals to use at their own discretion, particularly at home, hence the personal computer (PC). PCs use a central processing unit (CPU) called a microprocessor—a chip that performs the arithmetic, logic, and control functions needed to compute. In addition, PCs contain memory modules, sound and graphics cards, a hard drive, and a monitor, as well as some combination of the following peripheral devices: keyboard, mouse, floppy disk drive, tape drive, ZIP drive, speakers, joystick, game controller, modem, printer, scanner, digital camera, and others. No other computer supports such an array of peripheral devices.

In 1975, the first personal computer, the Altair, was released as a kit to be assembled by hobbyists. The kit also came with a version of the BASIC programming language developed by [|Bill Gates] and Paul Allen, who went on to found the [|Microsoft Corporation]. Only two years later, in 1977, Steve Jobs and Stephen Wozniak founded [|Apple Computer], Inc. and released the Apple II personal computer (the Apple I was for hobbyists). Then, in 1984, Apple released its Macintosh computer, a desktop computer based on a graphical user interface (GUI) that was very popular with millions of individuals and small businesses.

In 1981, IBM shipped its first personal computer, using a disk operating system (DOS) created by Microsoft. IBM then published its computer interface definitions for widespread distribution, creating the IBM-compatible computer boom. Meanwhile, in 1985, Microsoft shipped its first version of the graphical Windows operating system. At that point, every personal computer manufacturer other than Apple and the IBM-compatible producers faded into insignificance. Even IBM and Apple did not benefit as much as the new companies producing IBM-compatible computers, particularly Compaq, Gateway, and Dell, among many others.

Creating software for the PC also became a large industry because home users do not usually program their own applications but purchase them instead. The software industry produces popular applications for office support, finance, tax preparation, and games. The corporate giant among those producers is Microsoft, which dominates the software industry with both its Windows operating system and a host of other application software. Companies use many of the same programs to support business applications, usually as part of a departmental system. The introduction of personal computers, prompting a general revolution in the public's computer skills, has altered all aspects of business computing.

Personal computers have evolved into other forms over the last 20 years, examples being laptops, notebooks, and hand-held devices. Many of these developments are due to the increasing popularity of the Internet. The Internet's most common uses are to transmit electronic mail (e-mail), provide access to chat rooms, download files, and run Internet-compliant applications that allow users to search for information, publish documents, sell products or services, conduct financial transactions, and play games. The popularity of the Internet has led to another variant of the PC—the network PC—where most data storage and programs are obtained from Internet servers on the network rather than being installed directly on individual computers.

Recently, microprocessors have been embedded in other devices as well, including automobiles, guidance systems, and satellites, to control their functions without the user having to interact with the computer. Over time, microprocessors will be added to such devices as digital wristwatches, cell phones, calculators, cameras, radios, televisions, home heating systems, and appliances.

The computer revolution is not without its problems, however, mostly due to the rapid social changes it has prompted in lifestyles. Computers have changed how people work, shop, and play. E-mail is replacing long-distance telephone calls, and chat rooms offer worldwide online groups dedicated to subjects of mutual interest. People are also using online versions of banking and purchasing rather than shopping at stores. Many speculate that a substantial movement among workers to work from their home computers, rather than commuting into an office, will have the biggest impact on the way people live. While supporters maintain that computers and the Internet have allowed people to explore new interests and introduced them to vast amounts of information on all sorts of topics, critics counter that people are becoming more and more isolated, substituting computer activity for social interaction with fellow human beings.

Computers have also changed the types of skills needed by working Americans. By automating work, computers have increased productivity and reduced the cost of products. Companies now need fewer clerical and manufacturing workers and more employees with engineering, programming, and business backgrounds to design, develop, and manage the increasingly complex systems—reducing opportunities for people without a college [|education] and increasing opportunities for people with college degrees.

Despite those problems, computers should become even more important to most Americans in the future. Such new advanced features as voice recognition, streaming video, and artificial intelligence will prompt even more changes for society. The revolution rolls on.

ID: 263203

"computer revolution." //American History//. ABC-CLIO, 2010. Web. 25 Feb. 2010. .
----


[[image:http://digitalstorytelling.coe.uh.edu/graphics/logo.jpg]]
Digital Storytelling is the practice of using computer-based tools to tell stories. As with traditional storytelling, most digital stories focus on a specific topic and contain a particular point of view. However, as the name implies, digital stories usually contain some mixture of computer-based images, text, recorded audio narration, video clips and/or music. Digital stories can vary in length, but most of the stories used in education typically last between two and ten minutes. The topics that are used in Digital Storytelling range from personal tales to the recounting of historical events, from exploring life in one's own community to the search for life in other corners of the universe, and literally, everything in between. A great way to begin learning about Digital Storytelling is by watching the following video introduction to Digital Storytelling.

British photographer, educator, and digital storyteller [|Daniel Meadows] defines digital stories as "short, personal multimedia tales told from the heart." He maintains that the beauty of this form of digital expression is that these stories can be created by people everywhere, on any subject, and shared electronically all over the world. Meadows goes on to describe digital stories as //"multimedia sonnets from the people" in which "photographs discover the talkies, and the stories told assemble in the ether as pieces of a jigsaw puzzle, a gaggle of invisible histories which, when viewed together, tell the bigger story of our time, the story that defines who we are."//
**An Introduction to Digital Storytelling**

We invite you to explore the resources on this website and contact us if you have questions, comments, or ideas about the Educational Uses of Digital Storytelling.

----

**Educational Goals of Digital Storytelling**

Educators at all levels can use Digital Storytelling in many ways, from introducing new material to helping students learn to conduct research, synthesize large amounts of content, and gain expertise in the use of digital communication and authoring tools. It can also help students organize their ideas as they learn to create stories for an audience and present their ideas and knowledge in an individual and meaningful way.

//Teachers can://
 * Create a digital story for use as an anticipatory set or hook for a lesson;
 * Enhance current lesson plans with the use of a digital story within a unit;
 * Assign student-created stories, which require students to research a topic from a particular point of view.

//Students can://
 * Learn to use the Internet to research rich, deep content while analyzing and synthesizing a wide range of material;
 * Develop communication skills by learning to ask questions, express opinions, construct narratives, and write for an audience;
 * Increase their computer skills using software that combines a variety of multimedia, including text, still images, audio, video, and web publishing.

**Educational Objectives of Digital Storytelling**

Although not a comprehensive list, digital storytelling can be used to:
 * Appeal to the diverse learning styles of students by using Digital Storytelling as a presentation medium;
 * Generate interest, attention, and motivation for the "digital generation" kids in our classrooms;
 * Capitalize on the creative talents of your own students as they begin to research and tell stories of their own;
 * Publish student work on the Internet for viewing and critiquing by others;
 * Promote the accomplishment of cross-curricular academic standards and learning objectives.

----

=Steven Spielberg=

Individual
Perhaps the most financially successful filmmaker of all time, Steven Spielberg has also had an incalculable effect on American popular culture. With such films as //Jaws//, //E.T.//, //Indiana Jones and the Last Crusade//, //Schindler's List//, //Jurassic Park//, and //Saving Private Ryan// to his credit, no one has thrilled American movie audiences more or fueled more late-night nightmares than Spielberg. One of his trademarks as a filmmaker is his ability to portray his subjects with the innocence and awe of a child.

Born in Cincinnati, Ohio, on December 18, 1946, Spielberg was the oldest of four children. His father was an engineer and a pioneer in the new field of computers, and his mother was a concert pianist before she had children. The Spielbergs were often the only Jewish family in their neighborhood (they also lived in New Jersey, Arizona, and California), and the issue was a sensitive one for their only son, who just wanted to be like everyone else.

As a boy, Spielberg was awkward and often the target of bullies. However, starting at age 12, he took solace in his hobby of making films with his father's eight-millimeter camera. Sparked by inspiration after seeing [|Cecil B. DeMille's] film //The Greatest Show on Earth//, Spielberg used the willing help of his creative, eccentric mother and three sisters to make his short horror movies at home, and they served as both technical helpers and murder victims. As the filmmaker later recalled, "I killed them all several times."

By the time he was 13 years old, Spielberg had mastered the art of filmmaking sufficiently to win a contest with his 40-minute war film //Escape to Nowhere//. When he was 17 years old, he made the science fiction epic //Firelight//, his first feature-length film. The family hired a local theater to show the movie and in one night earned back the $500 it cost to make the film. Spielberg continued to practice filmmaking through his studies at Saratoga High School in California, partly as a way to stave off unhappiness over several [|anti-Semitic] incidents at school and because his parents were on the verge of divorce.

When it came time for Spielberg to go to college, he discovered that none of the film schools he wanted to attend would accept him because of his poor grades in high school. He finally settled on California State College at Long Beach, where he earned a bachelor's degree in English in 1970. While he was in college, Spielberg's fascination with films continued to grow. He spent much of his time studying movies and learned how to bluff his way past the security guards at Universal Studios. Once inside, he found an empty office that wasn't in use and "just assumed that people assumed that I was somebody's son."

From the office, Spielberg tried to talk Universal Studios producers into evaluating his films. One of his movies of that period, //Amblin'//, won awards at the Atlanta and Venice film festivals, although he later condemned the work as "an attack of crass commercialism." Spielberg finally succeeded in getting Sidney Sheinberg, the head of Universal Studios' television division, to look at the film. Sheinberg was so impressed that he signed the 21-year-old filmmaker to a seven-year contract.

Spielberg's first assignment for Universal was to direct the legendary actress [|Joan Crawford] in the first installment of the television anthology series //Night Gallery//. For the rest of his apprenticeship, he directed episodes of //Marcus Welby, M.D.//, //The Name of the Game//, //Columbo//, //Owen Marshall//, and //The Psychiatrists//. Having jumped through those hoops, Spielberg finally got signed to direct his first made-for-television movie, //Duel//, starring Dennis Weaver. When it came out, the critics raved about the movie's fluid editing and the way tension built smoothly through the whole show. Some even called //Duel// the best television movie ever made.

In 1974, Spielberg made his feature film debut with //Sugarland Express//, a comedy-drama starring Goldie Hawn. Reviewers were amazed by the skill of the new filmmaker and complimented him for taking on such a complex film that required so much precise, technical camerawork. Despite critical praise, the movie did poorly at the box office. Spielberg was disappointed at the turnout for //Express// but turned his attention to the new movie Universal wanted him to make, an adaptation of Peter Benchley's novel //Jaws//. If his last film had not drawn much of a crowd, //Jaws// more than made up for it when the movie came out in 1975.

Starring Roy Scheider, Richard Dreyfus, and Robert Shaw, the terrifying film about a man-eating great white shark broke all the records for box office receipts. Encouraged by that movie's success, Spielberg went on to write and make a science fiction film, //Close Encounters of the Third Kind//, starring Dreyfus again as one of a group of people who become obsessed with alien spaceships. That film, in addition to receiving tremendous reviews and earning millions of dollars, won Spielberg his first Oscar nomination as best director.

The critically condemned comedy //1941// (1979) cast a brief shadow on Spielberg's career, but when his movie //Raiders of the Lost Ark// hit the theaters in 1981, all was forgiven. Centered on archaeologist Indiana Jones' adventurous search for a rare religious artifact, the action-packed film was a huge hit. (Its two sequels, //Indiana Jones and the Temple of Doom// [1984] and //Indiana Jones and the Last Crusade// [1989], did just as well financially, but not critically.) Next came what Spielberg has called his "most personal film," //E.T.: The Extra-Terrestrial// (1982). Centered on a gentle alien and a boy who befriends and tries to help him, the movie topped even //Jaws// as the largest-grossing movie of all time and earned Spielberg another Academy Award nomination.

In 1984, Spielberg decided to establish a production company called Amblin Entertainment to help him handle all the projects that were coming in. He began concentrating his energy on producing films; he no longer had the time to direct all the ideas he had. In 1984, Amblin produced its first movie, //Gremlins//, followed by the highly popular //Back to the Future// (1985), //Young Sherlock Holmes// (1985), //The Money Pit// (1986), the animated //Who Framed Roger Rabbit?// (1988), and //Joe Versus the Volcano// (1990). In 1988, however, Spielberg realized that he was sick of producing, calling his most recent films "candy" that could "ruin his health."

Meanwhile, directors [|Martin Scorsese] and [|George Lucas] had been trying to get Spielberg to produce films for a more mature audience. Toward that end, Spielberg returned to directing in 1985 with the film //The Color Purple//. Despite the movie's 11 Oscar nominations—although not for best director—some critics slammed it, saying Spielberg had reduced [|Alice Walker's] [|Pulitzer Prize]-winning novel about the poverty and brutality of a black woman's existence into a series of pretty, Hollywood-formulated set pieces. Spielberg's next efforts at "adult" movies, //Empire of the Sun// (1987), //Always// (1989), and //Hook// (1991), were among his less successful films.

With the release of the film //Jurassic Park// in 1993, Spielberg's career took off again. It was so popular that it ousted //E.T.// from its reign as biggest-grossing movie ever. Millions bought the movie tie-ins and souvenir items, and dinosaurs became the best-selling children's toys. However, reviewers almost uniformly criticized the film for being only an amalgam of all of Spielberg's successful past movie ideas and for being coldly calculated to make as much money as possible.

The director countered that criticism with his next film, //Schindler's List// (1993), the true story of a German industrialist who spent much of his own fortune saving Jews from Nazi [|concentration camps] during [|World War II]. The black-and-white film elicited warm praise from critics and from Spielberg's fans. //Schindler's List// received 11 Academy Award nominations and won seven, including Spielberg's first Oscar for best director. Spielberg also won the Directors Guild of America award for the movie. He established several charitable organizations for Jews with the proceeds from the film.

In the 1990s, Spielberg's endeavors also extended into television, where his productions have included the //Amazing Stories// series, //seaQuest DSV//, and //Tiny Toon Adventures//, although most of his television efforts have not been highly successful. In 1994, Spielberg and multimedia kings Jeffrey Katzenberg and David Geffen founded a new studio called Dreamworks SKG.

In 1998, Spielberg directed the boldly realistic and critically acclaimed World War II movie //Saving Private Ryan//, once again demonstrating his skill at presenting serious subject matter. Although //Saving Private Ryan—//which teamed Spielberg with [|Tom Hanks], one of Hollywood's most respected actors—lost in the race for best picture, it did win five other [|Academy Awards], including best director, and was a huge critical and box office success. Spielberg raised the bar significantly in portraying the reality of the battlefield with a graphic 24-minute opening sequence on the [|D-Day] landing.

Spielberg returned to science fiction with //A.I.,// which was released in 2001 and tells the story of a young boy who is actually a sentient robot designed for the comfort and convenience of childless adults. Based on a short story by British writer Brian Aldiss, the project was originally conceived by [|Stanley Kubrick], to whom the film is dedicated. He followed //A.I.// with two popular films in 2002—//Minority Report,// starring [|Tom Cruise], and //Catch Me If You Can,// starring Leonardo DiCaprio and reteaming Spielberg with Hanks. Spielberg next directed //The Terminal// (2004), once again working with Hanks, and the widely panned //War of the Worlds// (2005), starring Cruise. Next, Spielberg produced and directed the critically acclaimed //Munich// (2005), which retold the Munich Olympic murders of 1972; once again, he received an Oscar nod with nominations for both best director and best picture.

Pairing with director [|Clint Eastwood], Spielberg coproduced two films about the [|Battle of Iwo Jima]: the first release, //Flags of Our Fathers// (2006), presented the battle from the U.S. military's perspective; the second release, //Letters from Iwo Jima// (2006), presented the battle from the Japanese military's viewpoint, using the Japanese language with English subtitles. Both productions were well received, and //Letters from Iwo Jima// earned an Academy Award nomination for best picture. Most recently, Spielberg executive produced the television series //On the Lot//, which pitted film school students' movies against one another to win a studio contract, and the movie //Transformers// (2007).

ID: 247908

"Steven Spielberg." //American History//. ABC-CLIO, 2010. Web. 2 Mar. 2010. .
----

=George Lucas=

Individual
Best known for writing and directing the movie //Star Wars,// George Lucas, through his Lucasfilm company, exerted an influence on Hollywood film production perhaps second only to that of [|Walt Disney].

Born on May 14, 1944, in Modesto, [|California], Lucas grew up on his father's walnut farm. At first, he aspired to race cars and, as a teenager, often sped along the local highways in a modified Fiat. He changed his goal just days before his high school graduation, however, when he wrapped his car around a walnut tree and landed unconscious in the hospital. He later said:

The accident made me more aware of myself and my feelings. I began to trust my instincts. I had the feeling that I should go to college, and I did. I had the same feeling later that I should go to film school, even though everybody thought I was nuts. I had the same feeling when I made //Star Wars//, when even my friends told me I was crazy. These are things that have to be done, and I feel as if I have to do them.

After attending Modesto Junior College, Lucas, who had barely passed high school, gained admission to the University of Southern California's film program. There, he impressed director Roger Corman, who allowed him to shoot a short documentary about another director, [|Francis Ford Coppola]. Coppola, in turn, helped get Lucas a contract with [|Warner Brothers] so the young man could write a plot for a science fiction movie, //THX 1138.//

Released in 1971, //THX 1138// received generally poor reviews, yet some saw potential in Lucas, and //Newsweek// magazine called the movie "an extremely professional first film." Lucas shot his second movie, //American Graffiti,// which he wrote and directed, on a low budget, but critics praised it, and in 1973, audiences flocked to see it. At that point, Twentieth Century Fox signed Lucas to shoot //Star Wars//. He wrote the screenplay over three years and shot the movie in Tunisia and England. A spectacular presentation, its advanced technology and modernized Flash Gordon-type science fiction story made it an instant hit, breaking all box office records in 1977 and becoming a classic. He followed //Star Wars// with the equally successful //The Empire Strikes Back.//

Following his success, Lucas founded Lucasfilm, a private company with himself as chairman of the board. Located in San Rafael, California, Lucasfilm consisted in the early 1980s of five stucco buildings that housed, in addition to a studio, Industrial Light & Magic (ILM), an optical research lab. At the time, Lucasfilm had average annual revenues of $26 million, and Lucas was worth $60 million. He expressed his preference, however, for directing and editing movies over executive work. "Running the company to me is like mowing the lawn," he said. "It has to be done; I semi-enjoy it, once in a while."

A private person, Lucas avoided the Hollywood limelight and preferred staying home with his wife Marcia, a film editor, and their adopted daughter. That relationship unraveled in 1983, however, and the couple divorced.

For Lucas, professional success came again with the third movie in the //Star Wars// trilogy, //Return of the Jedi,// which in its first three weeks at the theaters in 1983 grossed $100 million. At the same time, Lucas guided ILM into new technology, creating special effects for movies such as //E.T., Poltergeist,// //Raiders of the Lost Ark,// and two other [|Indiana] Jones movies that he produced (with [|Steven Spielberg] directing)//.//

Over the next decade, Lucas poured $200 million into ILM so his technology could revolutionize Hollywood. //Forbes// magazine claimed in 1996 that ILM was changing the movie business "as radically as did talkies and Technicolor." ILM special effects appeared in //Forrest Gump// and //Jurassic Park,// among other motion pictures. Lucas had developed a way for computers to digitally alter film, after which a technician using software could transpose and create images within the film's setting.

Thus, for example, in //The American President,// Lucas digitally recreated the [|House of Representatives], wrapping it around the movie's star, Michael Douglas, who in the role of president, presented his [|State of the Union] speech there. Without Lucas' technology, the director would have been required to film on location, a difficult task given the public building involved. In another movie, //Cliffhanger,// star Sylvester Stallone performed apparently death-defying stunts along the sides of mountains, but he was actually suspended from wires and protected by nets that, on the film, Lucas digitally erased.

Throughout the 1980s and 1990s, Lucas served as executive producer on numerous projects, including the films //Willow// and //Tucker: The Man and his Dream// and the television series //The Young Indiana Jones Chronicles.// He went back behind the camera to direct 1999's //Star Wars: Episode I - The Phantom Menace,// which he also wrote and executive produced. By the end of the year, //The Phantom Menace// had earned $922 million worldwide, making it the second-highest grossing film in history. Similarly, //Star Wars: Episode II - Attack of the Clones// (2002) has earned $648 million worldwide, and //Star Wars: Episode III - Revenge of the Sith// is expected to gross equally high figures when it is released in 2005.

In 2001, //Forbes// magazine reported in its annual issue on the 400 richest people in America that Lucas' fortune was estimated at $3 billion. Lucas' success can best be explained by his observation:

I'm very aware as a creative person that those who control the means of production control the creative vision. It's not a matter of going down and saying, 'You're going to let me have the final cut.' Because no matter what you do in a contract, they will go around it. Whereas if you own the cameras and you own the film, there's nothing they can do to stop you.

ID: 247353

Harmon, Justin, et al. "George Lucas." //American History//. ABC-CLIO, 2010. Web. 2 Mar. 2010.
----

=World War II Movies=

Other Article
The opening of the movie //Pearl Harbor// on thousands of screens across the United States on May 25, 2001, continued the resurgence of [|World War II] films that started with [|Steven Spielberg's] //Schindler's List// in 1993. //Pearl Harbor// grossed $75 million over Memorial Day weekend and reached the $200 million mark by the end of its theatrical run, proving once again that World War II movies often translate into big box office.

From the moment the Japanese attacked [|Pearl Harbor] on December 7, 1941, Hollywood has churned out numerous World War II movies, many of which have garnered both critical acclaim and terrific box office results. Eight World War II films have won an Academy Award for best picture, and a ninth, Spielberg's //Saving Private Ryan,// would have won in 1999 if not for a very expensive marketing campaign launched by Miramax Films for //Shakespeare in Love.// Two episodes of the seven-part documentary series //Why We Fight,// directed by [|Frank Capra], also won Oscars for best documentary.

When the United States entered World War II, [|U.S. Army] chief of staff [|George C. Marshall] wanted to make sure both U.S. troops and the U.S. citizenry supported the war effort. As part of the government's aggressive propaganda campaign to galvanize public support, Marshall chose Capra, the director of several Hollywood movies (including //It's a Wonderful Life//) to direct //Why We Fight.// Capra, at the time a major in the U.S. Army Signal Corps, worked quickly, and the first installment in the //Why We Fight// series, //Prelude to War,// won the 1942 Academy Award for best documentary.

While in production on //Why We Fight,// which was required viewing for all new enlistees, Capra received help from such legendary Hollywood figures as John and Walter Huston and [|Walt Disney] and had access to the state-of-the-art facilities at [|Metro-Goldwyn-Mayer], Paramount Pictures, and 20th Century Fox. Using authentic newsreel footage and captured enemy film, Capra directed six more episodes: //The Nazi Strike// (1943), //Divide and Conquer// (1943), //Battle of Britain// (1943), //Battle of China// (1944), //Battle of Russia// (1944), and //War Comes to America// (1945). //Battle of Russia// also received an Academy Award for best documentary.

In the meantime, the rest of Hollywood was working to produce World War II features for an eager audience. The Battle of Wake Island, which began shortly after the attack on Pearl Harbor, was quickly dramatized by Hollywood and released as //Wake Island// in 1942. //Mrs. Miniver// and //Casablanca,// the first of eight World War II movies to win an Academy Award for best picture, came out the same year. Throughout the rest of the war years, a flood of World War II movies hit movie theaters that featured such major stars as [|Cary Grant] (//Destination Tokyo//), [|Humphrey Bogart] (//Casablanca//, //Sahara//), [|John Wayne] (//Back to Bataan//, //The Fighting Seabees//), and Errol Flynn (//Objective, Burma!//). [|Spencer Tracy] starred in the 1944 film //Thirty Seconds Over Tokyo// about the Doolittle raid, an event that is also featured prominently at the end of //Pearl Harbor.//

In 1946, a year after the war ended, //The Best Years of Our Lives,// a story about World War II veterans returning home, won an Academy Award for best picture. In 1949, a trio of classic World War II movies made it to theaters, including //Battleground,// which realistically depicts the [|Battle of the Bulge]; //Sands of Iwo Jima,// featuring one of Wayne's best performances; and //Twelve O'Clock High,// considered by many to be the best U.S. film about the air war.

Some of the best World War II movies were released during the 1950s, most significantly //From Here to Eternity,// which was adapted from the bestselling novel by James Jones and won eight [|Academy Awards], including best picture in 1953. The movie focuses on the lives of U.S. military personnel stationed in [|Hawaii] just before the attack on Pearl Harbor. [|Frank Sinatra] won a best supporting actor Oscar for his role in the film.

//Stalag 17,// directed by the legendary [|Billy Wilder], was also released in 1953 and resulted in a best actor Oscar for William Holden. The film brilliantly combines suspense and comedy to tell the story of a World War II prison camp. [|Audie Murphy], the most decorated U.S. soldier of World War II, played himself in the 1955 movie //To Hell and Back.// The year 1957 saw the release of David Lean's epic //The Bridge on the River Kwai,// which took home the best picture Oscar that year. The dramatic story centers on the construction of a bridge by American and British prisoners of war under the supervision of a Japanese colonel.

Many of the World War II movies of the 1960s were known for their epic length and star-studded casts. //The Guns of Navarone// (1961), about an international task force sent to Greece to destroy two huge German gun batteries, stars [|Gregory Peck], David Niven, and [|Anthony Quinn]. The early 1960s also saw the debut of two of the best of these star-studded blockbusters: //The Longest Day// and //The Great Escape//. //The Longest Day// is a wonderful recreation of the Allied invasion of Normandy on D-Day and features Wayne, [|Robert Mitchum], [|Henry Fonda], and Sean Connery. //The Great Escape,// which is anchored by [|Steve McQueen], James Coburn, James Garner, and Charles Bronson, is based on the true story of the biggest Allied prison break of World War II.

//The Battle of the Bulge,// perhaps the most disappointing of the star-studded World War II blockbusters, was released in 1965. Two years later, perennial fan-favorite //The Dirty Dozen// hit the big screen. Loaded with such stars as Lee Marvin, Bronson, Donald Sutherland, Ernest Borgnine, and ex-football star [|Jim Brown], the movie tells the story of 12 convicted murderers trained by a tough U.S. Army major to carry out a secret mission. In 1969, [|Clint Eastwood] got into the act in //Where Eagles Dare,// in which he and Richard Burton rescue a U.S. general from a heavily fortified German castle.

The year 1970 was another memorable year for World War II movies thanks largely to //Patton,// a first-rate film biography about the controversial U.S. general [|George S. Patton] that won the Academy Award for best picture. The title character was played brilliantly by [|George C. Scott], who won a best acting Oscar for his efforts but refused to accept it, denouncing the Academy Awards as a "self-serving meat parade." //Tora! Tora! Tora!,// a joint U.S.-Japanese venture that was released the same year, tells the story of the events surrounding the Pearl Harbor attack from both an American and Japanese point of view.

The mid-1970s saw the return of the star-studded blockbuster as the [|Battle of Midway] was given the full Hollywood treatment. //Midway,// which features Fonda, Mitchum, and [|Charlton Heston], was marred by an intrusive romantic subplot, the same complaint that was made about //Pearl Harbor// by several critics. The following year, a parade of stars could be seen in //A Bridge Too Far,// which showcased [|Robert Redford], Laurence Olivier, Michael Caine, James Caan, and Connery.

The 1980s got off to a strong start with //The Big Red One,// a superb World War II movie that features Marvin as a grizzled sergeant leading a platoon through the battlefields of North Africa and Europe. However, Hollywood had little else to offer for the rest of the decade as [|Vietnam War] movies like //Platoon// took center stage. Spielberg did direct his first World War II movie in 1987, but //Empire of the Sun// is not up to par with //Schindler's List// or //Saving Private Ryan//.

The 1990s started quietly for World War II movies, with only the entertaining //Memphis Belle// and the intelligent antiwar drama //A Midnight Clear// worth mentioning. In 1993, however, //Schindler's List// put World War II movies back on the front burner in Hollywood. Spielberg's searing [|Holocaust] masterpiece brought in nearly $100 million at the box office and took home seven Academy Awards, despite its difficult subject matter and black-and-white photography. Two years later, Home Box Office ([|HBO]) produced //The Tuskegee Airmen,// a film about the [|Tuskegee Airmen], the first African-American World War II fighter squadron to face aerial combat. In 1997, another World War II drama, //The English Patient,// also performed well at the box office and won the Academy Award for best picture.

The year 1998 was a milestone year for World War II movies, with three films about the war nominated for best picture. Along with //Saving Private Ryan,// the other nominees were //The Thin Red Line,// a visually compelling account of the [|Battle of Guadalcanal], and //Life is Beautiful,// an Italian film that features a Jewish father who uses humor to help his son survive the horrors of a [|concentration camp]. Although //Saving Private Ryan// lost in the race for best picture, it did win five other Academy Awards, including best director, and was a huge critical and box office success. Spielberg raised the bar significantly in portraying the reality of the battlefield with a graphic 24-minute opening sequence on the D-Day landing.

In 2000, the submarine action film //U-571// enjoyed moderate success at the box office, as did //Enemy at the Gates// in the spring of 2001. //Enemy at the Gates,// which depicts the duel between two snipers during the Battle of Stalingrad, was clearly influenced by //Saving Private Ryan,// particularly the graphic battle sequence near the beginning of the film. The same can be said for the gripping 40-minute sequence in //Pearl Harbor,// in which the director, Michael Bay, went to great lengths to accurately portray every detail of the Japanese surprise attack.

World War II stories continued to flood movie theaters and television sets throughout 2001. HBO aired //Conspiracy,// a movie that chillingly recreates the 1942 Wannsee Conference, in which 15 high-ranking members of the Third Reich plan the mass killing of every Jewish person on the European continent. Also in 2001, //Captain Corelli's Mandolin,// a romance set in World War II Greece starring Nicolas Cage, was released, and HBO aired //Band of Brothers,// a 10-part miniseries based on Stephen Ambrose's best-selling World War II book.

ID: 342245

"World War II Movies." //American History//. ABC-CLIO, 2010. Web. 2 Mar. 2010. .
----