
How to use Google Scholar: the ultimate guide

  • What is Google Scholar?
  • Why is Google Scholar better than Google for finding research papers?
  • The Google Scholar search results page
  • The first two lines: core bibliographic information
  • Quick full-text access options
  • "Cited by" count and other useful links
  • Tips for searching Google Scholar
      1. Google Scholar searches are not case sensitive
      2. Use keywords instead of full sentences
      3. Use quotes to search for an exact match
      4. Add the year to the search phrase to get articles published in a particular year
      5. Use the sidebar controls to adjust your search results
      6. Use Boolean operators to better control your searches
  • Google Scholar advanced search interface
  • Customizing search preferences and options
  • Using the "My library" feature in Google Scholar
  • The scope and limitations of Google Scholar
  • Alternatives to Google Scholar
  • Country-specific Google Scholar sites
  • Frequently asked questions about Google Scholar
  • Related articles

What is Google Scholar?

Google Scholar (GS) is a free academic search engine that can be thought of as the academic version of Google. Rather than searching all of the indexed information on the web, it searches repositories of:

  • universities
  • scholarly websites

This is generally a smaller subset of the pool that Google searches. It's all done automatically, but most of the search results tend to be reliable scholarly sources.

However, Google Scholar is typically less careful about what it includes in search results than more curated, subscription-based academic databases like Scopus and Web of Science. As a result, it is important to take some time to assess the credibility of the resources linked through Google Scholar.

➡️ Take a look at our guide on the best academic databases.

Google Scholar home page

One advantage of using Google Scholar is that the interface is comforting and familiar to anyone who uses Google. This lowers the learning curve for finding scholarly information.

There are a number of useful differences from a regular Google search. Google Scholar allows you to:

  • copy a formatted citation in different styles including MLA and APA
  • export bibliographic data (BibTeX, RIS) to use with reference management software
  • explore other works that have cited the listed work
  • easily find full text versions of the article

Although it is free to search in Google Scholar, most of the content is not freely available. Google does its best to find copies of restricted articles in public repositories. If you are at an academic or research institution, you can also set up a library connection that allows you to see items that are available through your institution.

The Google Scholar results page differs from the regular Google results page in a few key ways, and it is worth being familiar with the different pieces of information that are shown. Let's have a look at the results for the search term "machine learning".

Google Scholar search results page

  • The first line of each result provides the title of the document (e.g. of an article, book, chapter, or report).
  • The second line provides the bibliographic information about the document, in order: the author(s), the journal or book it appears in, the year of publication, and the publisher.

Clicking on the title link will bring you to the publisher’s page where you may be able to access more information about the document. This includes the abstract and options to download the PDF.

Google Scholar quick link to PDF

To the far right of the entry are more direct options for obtaining the full text of the document. In this example, Google has also located a publicly available PDF of the document hosted at umich.edu. Note that it is not guaranteed to be the version of the article that was finally published in the journal.

Google Scholar: more action links

Below the text snippet/abstract you can find a number of useful links.

  • Cited by : the "Cited by" link will show other articles that have cited this resource. This is a super useful feature that can help you in two ways. First, it is a good way to track more recent research that has referenced the article; second, the fact that other researchers have cited the document lends it greater credibility. Be aware, however, that there is a publication lag: an article published in 2017 will not yet have an extensive number of "Cited by" results, because it typically takes at least six months for citing articles to appear in print.
  • Versions : this link will display other versions of the article or other databases where the article may be found, some of which may offer free access to the article.
  • Quotation mark icon : this will display a popup with commonly used citation formats such as MLA, APA, Chicago, Harvard, and Vancouver that may be copied and pasted. Note, however, that the Google Scholar citation data is sometimes incomplete, so it is often a good idea to check it against the source. The "cite" popup also includes links for exporting the citation data as BibTeX or RIS files that any major reference manager can import (an example of such an export is sketched below).

Google Scholar citation panel
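To give a sense of what the exported data looks like, here is a minimal Python sketch that reads a single BibTeX entry of the kind the "cite" popup produces. The entry itself is a made-up example rather than an actual Google Scholar export, and the regular expressions are a rough sketch rather than a full BibTeX parser; a reference manager handles this far more robustly.

```python
# Minimal sketch: reading one BibTeX entry of the kind exported from the "cite" popup.
# The entry below is an illustrative, made-up example, not a real Google Scholar export.
import re

bibtex = """@article{doe2021selfdriving,
  title={A hypothetical study of self-driving cars},
  author={Doe, Jane and Roe, Richard},
  journal={Journal of Illustrative Examples},
  year={2021}
}"""

# Entry type and citation key, e.g. ("article", "doe2021selfdriving")
entry_type, cite_key = re.match(r"@(\w+)\{([^,]+),", bibtex).groups()

# Field lines of the form `name={value}` (good enough for simple, unnested entries)
fields = dict(re.findall(r"(\w+)\s*=\s*\{([^}]*)\}", bibtex))

print(entry_type, cite_key)
print(f"{fields['author']} ({fields['year']}). {fields['title']}. {fields['journal']}.")
```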

Pro tip: Use a reference manager like Paperpile to keep track of all your sources. Paperpile integrates with Google Scholar and many other popular academic search engines and databases, so you can save references and PDFs directly to your library using the Paperpile buttons and later cite them in thousands of citation styles:


Although Google Scholar limits each search to a maximum of 1,000 results, that is still far too many to explore, and you need an effective way of locating the relevant articles. Here’s a list of pro tips that will help you save time and search more effectively.

You don’t need to worry about case sensitivity when you’re using Google Scholar. In other words, a search for "Machine Learning" will produce the same results as a search for "machine learning".

Let's say your research topic is self-driving cars. For a regular Google search we might enter something like "what is the current state of the technology used for self-driving cars". In Google Scholar, this query will return less than ideal results.

The trick is to build a list of keywords and perform searches for them instead, for example self-driving cars, autonomous vehicles, or driverless cars. Google Scholar will assist you with this: if you start typing in the search field, you will see related queries suggested by Scholar.

If you put your search phrase into quotes you can search for exact matches of that phrase in the title and the body text of the document. Without quotes, Google Scholar will treat each word separately.

This means that if you search for national parks, the words will not necessarily appear together. Grouped words and exact phrases should be enclosed in quotation marks.

A search using “self-driving cars 2015,” for example, will return articles or books published in 2015.

Using the options in the left-hand panel, you can further restrict the search results by limiting the years covered by the search and by including or excluding patents, and you can sort the results by relevance or by date.
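The year filter and other options are also reflected in the results-page URL, so a search can be reproduced by constructing that URL directly, for instance to bookmark or share it. The Python sketch below is only an illustration: Google Scholar has no official API, and the parameter names used here (q for the query, as_ylo and as_yhi for the year range) are simply those commonly observed in results-page URLs, so treat them as assumptions rather than documented behavior.

```python
# Minimal sketch: reproducing a Google Scholar search as a URL.
# Assumption: the undocumented query parameters q, as_ylo and as_yhi,
# as commonly observed in Google Scholar results-page URLs.
from urllib.parse import urlencode

def scholar_url(query, year_from=None, year_to=None):
    params = {"q": query}
    if year_from is not None:
        params["as_ylo"] = year_from  # lower bound of the year filter
    if year_to is not None:
        params["as_yhi"] = year_to    # upper bound of the year filter
    return "https://scholar.google.com/scholar?" + urlencode(params)

# An exact-phrase search limited to articles published in 2015
print(scholar_url('"self-driving cars"', year_from=2015, year_to=2015))
```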

Searches are not case sensitive; however, there are a number of Boolean operators you can use to control the search, and these must be capitalized.

  • AND requires both of the words or phrases on either side to be somewhere in the record.
  • NOT can be placed in front of a word or phrase to exclude results which include it.
  • OR will give equal weight to results which match just one of the words or phrases on either side.

➡️ Read more about how to efficiently search online databases for academic research.

In case you feel overwhelmed by the above options, here are some illustrative examples:
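These are representative queries only, not drawn from any particular search; they simply combine the quoting and Boolean operators described above, and any term can be swapped for your own keywords.

  • "self-driving cars" AND safety : both the exact phrase and the word safety must appear in the record.
  • "self-driving cars" OR "autonomous vehicles" : records matching either phrase are returned.
  • "autonomous vehicles" NOT truck : excludes records that contain the word truck.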

Tip: Use the advanced search features in Google Scholar to narrow down your search results.

You can gain even more fine-grained control over your search by using the advanced search feature. This feature is available by clicking on the hamburger menu in the upper left and selecting the "Advanced search" menu item.

Google Scholar advanced search

Adjusting the Google Scholar settings is not necessary for getting good results, but offers some additional customization, including the ability to enable the above-mentioned library integrations.

The settings menu is found in the hamburger menu located in the top left of the Google Scholar page. The settings are divided into five sections:

  • Collections to search: by default, Google Scholar searches articles and includes patents, but this default can be changed if you are not interested in patents or if you wish to search case law instead.
  • Bibliographic manager: you can export relevant citation data via the “Bibliography manager” subsection.
  • Languages: if you wish for results to return only articles written in a specific subset of languages, you can define that here.
  • Library links: as noted, Google Scholar allows you to get the Full Text of articles through your institution’s subscriptions, where available. Search for, and add, your institution here to have the relevant link included in your search results.
  • Button: the Scholar Button is a Chrome extension which adds a dropdown search box to your toolbar. This allows you to search Google Scholar from any website. Moreover, if you have text selected on a page when you click the button, it will display Google Scholar results for those words.

When signed in, Google Scholar adds some simple tools for keeping track of and organizing the articles you find. These can be useful if you are not using a full academic reference manager.

All the search results include a "save" button at the end of the bottom row of links; clicking this will add the article to your "My Library".

To help provide some structure, you can create and apply labels to the items in your library. Appended labels will appear at the end of the article titles. For example, the following article has been assigned an “RNA” label:

Google Scholar  my library entry with label

Within your Google Scholar library, you can also edit the metadata associated with titles. This is often necessary, as Google Scholar citation data can be faulty.

There is no official statement about how big the Scholar search index is, but unofficial estimates put it in the range of about 160 million records, and it is thought to grow by several million more each year.

Yet Google Scholar does not return all the resources that you might get from a search of your local library catalog. For example, a library database could return podcasts, videos, articles, statistics, or special collections. For now, Google Scholar covers only the following publication types:

  • Journal articles : articles published in journals. This is a mixture of articles from peer-reviewed journals, predatory journals, and pre-print archives.
  • Books : links to Google's limited preview of the text, when available.
  • Book chapters : chapters within a book, sometimes also available electronically.
  • Book reviews : reviews of books, though it is not always apparent from the search result that the item is a review.
  • Conference proceedings : papers written as part of a conference, typically accompanying a presentation at that conference.
  • Court opinions .
  • Patents : Google Scholar only searches patents if the option is selected in the search settings described above.

The information in Google Scholar is not cataloged by professionals. The quality of the metadata depends heavily on the source that Google Scholar is pulling the information from. This is a very different process from how information is collected and indexed in scholarly databases such as Scopus or Web of Science.

➡️ Visit our list of the best academic databases.

Google Scholar is by far the most frequently used academic search engine, but it is not the only one. Other academic search engines include:

  • Science.gov
  • Semantic Scholar

Google Scholar also has country-specific sites, each carrying a localized version of its "Stand on the shoulders of giants" motto:

  • scholar.google.fr : Sur les épaules d'un géant
  • scholar.google.es (Google Académico): A hombros de gigantes
  • scholar.google.pt (Google Académico): Sobre os ombros de gigantes
  • scholar.google.de : Auf den Schultern von Riesen

➡️ Once you’ve found some research, it’s time to read it. Take a look at our guide on how to read a scientific paper .

Is Google Scholar a database?

No. Google Scholar is a bibliographic search engine rather than a bibliographic database. In order to qualify as a database, Google Scholar would need to have stable identifiers for its records.

Is Google Scholar itself a scholarly source?

No. Google Scholar is an academic search engine, but the records found in Google Scholar are scholarly sources.

Does Google Scholar only contain peer-reviewed papers?

No. Google Scholar collects research papers from all over the web, including grey literature and non-peer-reviewed papers and reports.

Does Google Scholar provide full-text access?

Google Scholar does not provide any full-text content itself, but links to the full-text article on the publisher's page, which can be either open access or paywalled content. Google Scholar tries to provide links to free versions, when possible.

What is the easiest way to access Google Scholar?

The easiest way to access Google Scholar is by using the Google Scholar Button. This is a browser extension that allows you to easily access Google Scholar from any web page. You can install it from the Chrome Web Store.



Effective presentation skills


Robert Dolan, Effective presentation skills, FEMS Microbiology Letters , Volume 364, Issue 24, December 2017, fnx235, https://doi.org/10.1093/femsle/fnx235


Most PhDs will have a presentation component during the interview process, and will also present their work at conferences. This article provides guidance on how to develop relevant content and deliver it effectively to your audience.

Most organizations list communication skills among their most critical needs, and presentation skills are a large component of communication. Presentation skills are crucial to almost every aspect of academic and business life, from meetings, interviews and conferences to trade shows and job fairs. Oftentimes, leadership and presentation skills go hand in hand. In the 2016 NACE survey, the ability to communicate verbally (internally and externally) was rated 4.63/5.0 and was the #1 skill employers want. This article provides tips and strategies for delivering an effective presentation, one that aligns the speaker with the audience.

What type of speaker are you?

Facts and fears of public speaking.

Your blueprint for delivery.

Avoider —You do everything possible to escape from having to get in front of an audience.

Resister —You may have to speak, but you never encourage it.

Accepter —You’ll give presentations but don’t seek those opportunities. Sometimes you feel good about a presentation you gave.

Seeker —Looks for opportunities to speak. Finds the anxiety a stimulant that fuels enthusiasm during a presentation.

Public speaking can create anxiety and fear in many people. Dale Carnegie offers a free e-book that provides tips and advice on how to minimize these fears: www.dalecarnegie.com/Free-eBook

People are caught between their fear and the fact that many employers expect them to demonstrate good verbal communication skills.

Most interviews by PhD’s have a presentation component.

Academic interviews always have a presentation component.

If your job doesn’t demand presentation skills, odds are that you’ll need them in your next job.

Develop your blueprint for delivery:

Information by itself can be boring, unless it's unique or unusual. Conveying it through stories, gestures and analogies makes it interesting. A large portion of the impact of communication rests on how you look and sound, not only on what you say. Good presentation skills allow you to make the most of your first impression, especially at conferences and job interviews. As you plan your presentation, put yourself in the shoes of the audience.

Values …What is important to them?

Needs …What information do they want?

Constraints …Understand their level of knowledge on the subject and target them appropriately.

Demographics …Size of audience and location may influence the presentation. For example, a large auditorium may be more formal and less personal than a presentation to your team or lab mates in a less formal setting.

Structure—Introduction, Content and Conclusion

Body Language and Movement

Verbal Delivery

Introduction

Build rapport with audience (easier in a smaller less formal setting).

State preference for questions—during or after?

Set stage: provide agenda, objective and intended outcomes

Introduce yourself providing your name, role and function. Let the audience know the agenda, your objectives and set their expectations. Give them a reason to listen and make an explicit benefit statement, essentially what's in it for them. Finally, let them know how you will accomplish your objective by setting the agenda and providing an outline of what will be covered.

Deliver your message logically and structured.

Use appropriate anecdotes and examples.

Illustrate and emphasize key points by using color schemes or animations.

Establish credibility, possibly citing references or publications.

Structure your presentation to maximize delivery. Deliver the main idea and communicate to the audience what your intended outcome will be. Transition smoothly through the subject matter by using phrases such as ‘now we will review…’ or ‘if there are no more questions, we will now move on to…’. Be flexible but stay on course. If needed, use examples not in the presentation to emphasize a point, but don’t get sidetracked. Stay on course by using phrases such as ‘let's get back to…’. Occasionally, reiterate the benefits of the content and the main idea of your presentation.

Restate the main objective and key supporting points

For Q&A: ‘Who wants more details?’ (Not, ‘any questions?’)

Prompting for questions: ‘A question I often hear is…’

Summarize the main elements of your presentation as they relate to the original objective. If applicable, highlight a key point or crucial element for the audience to take away. Signal the end is near…‘to wrap up’ or ‘to sum up’. Clearly articulate the next steps, actions or practical recommendations. Thank the audience and solicit final questions.

Your non-verbal communications are key elements of your presentation. They are composed of open body posture, eye contact, facial expressions, hand gestures, posture and space between you and the audience.

Stand firmly and move deliberately. Do not sway or shift.

Move at appropriate times during presentation (e.g. move during transitions or to emphasize a point).

Stand where you can see everyone and do not block the visuals/screen.

Decide on a resting position for hands (should feel and look comfortable).

Gestures should be natural and follow what you are saying.

Hand movement can emphasize your point.

Make gestures strong and crisp…ok to use both arms/hands.

Keep hands away from face.

When pointing to the screen, do so deliberately; do not wave. Face the audience when you speak.

Look at audience's faces, not above their heads.

If an interview or business meeting…look at the decision makers as well as everyone else.

Look at faces for 3–5 seconds and then move on to the next person.

Do not look away from the audience for more than 10 seconds.

Looking at a person keeps them engaged.

Looking at their faces tells you how your delivery and topic are being received. The audience's body language may show interest, acceptance, openness, boredom, hostility, disapproval or neutrality. Read the audience and adjust where and if appropriate to keep them engaged. For example, if they seem bored, inject an interesting anecdote or story to trigger more interest. If they appear to disapprove, ask for questions or comments to better understand how you might adjust your delivery and content.

Use active rather than passive verbs.

Avoid technical terms, unless you know the audience is familiar with them.

Always use your own words and phrases.

Cut out jargon/slang words.

Look at your audience and use vocal techniques to catch their attention. Consider changing your pace or volume, use a longer than normal pause between key points, and change the pitch or inflection of your voice if needed. Consider taking a drink of water to force yourself to pause or slow down. View the audience as a group of individuals, and address them as if you were speaking to a single person.

Tips for reducing anxiety

If you experience nervousness before your presentation, as most people do, consider the following.

Be Organized —Knowing that your presentation and thoughts are well organized will give you confidence.

Visualize —Imagine delivering your presentation with enthusiasm and leaving the room knowing that you did a good job.

Practice —All successful speakers rehearse their presentations. Practice alone, with your team, or videotape yourself and review your performance afterwards. Another tip is to make contact before your talk: if possible, speak with audience members before your presentation begins (though this is not always possible with a large audience). Walk up to them and thank them in advance for inviting you to speak.

Movement —Speakers who stand in one spot may experience tension. In order to relax, move in a purposeful manner and use upper body gestures to make points.

Eye Contact —Make your presentation a one-on-one conversation. Build rapport by making it personal and personable. Use words such as ‘we’, ‘our’, ‘us’. Eye contact helps you relax because you become less isolated from the audience.

Personal appearance

Clothes should fit well and not be too tight. Consider wearing professional, business-like attire. Find two to three colors that work well for you. Conservative colors, such as black, blue, gray and brown, tend to be the safest bet when presenting or meeting someone for the first time in a professional setting. Depending upon the audience, a sport coat and well-matched dress slacks are fine. Generally, try to avoid bright reds, oranges and whites, since these tend to draw attention away from your face. Avoid jewelry that sparkles, dangles or makes noise. Use subtle accessories to complement your outfit.

Other resources:

  • www.toastmasters.org
  • https://www.skillsyouneed.com/present/presentation-tips.html
  • https://www.ag.ndsu.edu/evaluation/documents/effective-presentations-a-toolkit-for-engaging-an-audience



Open Access

Peer-reviewed

Research Article

Does a presentation’s medium affect its message? PowerPoint, Prezi, and oral presentations

  • Samuel T. Moulton
  • Selen Türkay
  • Stephen M. Kosslyn

Affiliations: Department of Psychology, Harvard University, Cambridge, Massachusetts, United States of America; Harvard Initiative for Learning and Teaching, Harvard University, Cambridge, Massachusetts, United States of America; Minerva Schools at the Keck Graduate Institute, San Francisco, California, United States of America

  • Published: July 5, 2017
  • https://doi.org/10.1371/journal.pone.0178774

12 Oct 2017: The PLOS ONE Staff (2017) Correction: Does a presentation's medium affect its message? PowerPoint, Prezi, and oral presentations. PLOS ONE 12(10): e0186673. https://doi.org/10.1371/journal.pone.0186673 View correction


Despite the prevalence of PowerPoint in professional and educational presentations, surprisingly little is known about how effective such presentations are. All else being equal, are PowerPoint presentations better than purely oral presentations or those that use alternative software tools? To address this question we recreated a real-world business scenario in which individuals presented to a corporate board. Participants (playing the role of the presenter) were randomly assigned to create PowerPoint, Prezi, or oral presentations, and then actually delivered the presentation live to other participants (playing the role of corporate executives). Across two experiments and on a variety of dimensions, participants evaluated PowerPoint presentations comparably to oral presentations, but evaluated Prezi presentations more favorably than both PowerPoint and oral presentations. There was some evidence that participants who viewed different types of presentations came to different conclusions about the business scenario, but no evidence that they remembered or comprehended the scenario differently. We conclude that the observed effects of presentation format are not merely the result of novelty, bias, experimenter-, or software-specific characteristics, but instead reveal a communication preference for using the panning-and-zooming animations that characterize Prezi presentations.

Citation: Moulton ST, Türkay S, Kosslyn SM (2017) Does a presentation’s medium affect its message? PowerPoint, Prezi, and oral presentations. PLoS ONE 12(7): e0178774. https://doi.org/10.1371/journal.pone.0178774

Editor: Philip Allen, University of Akron, UNITED STATES

Received: November 2, 2016; Accepted: May 18, 2017; Published: July 5, 2017

Copyright: © 2017 Moulton et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All data files are available from the Open Science Framework https://osf.io/fgf7c/ .

Funding: This research was supported by a grant from Prezi ( http://www.prezi.com ) to SMK. In the sponsored research agreement (which we are happy to provide) and in our conversations with Prezi leadership, they agreed to let us conduct the study as we wished and publish it no matter what the results revealed. Aside from funding the research, the only role that any employees of Prezi played was (as documented in the manuscript) 1) to provide us with a distribution list of Boston-area Prezi customers (8 of whom participated in the first experiment) and 2) as experts in Prezi, review the background questionnaire to ensure that we were accurately describing Prezi’s purported benefits and features (just as PowerPoint and oral presentation experts did the same). No employees at Prezi had any role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. None of the authors have any professional or financial connection to Prezi or personal relationships with any Prezi employees. We do not plan to conduct any follow-up research on this topic or obtain future funding from Prezi. As evident in the manuscript, we took special care not to allow bias or demand characteristics to influence this research.

Competing interests: This research was supported by a grant to SMK from Prezi ( http://www.prezi.com ), a commercial funder. This does not alter our adherence to PLOS ONE policies on sharing data and materials.

Introduction

How do the characteristics of a communication medium affect its messages? This question has been the subject of much philosophical and empirical inquiry, with some (e.g., [ 1 ]) claiming that the medium determines the message (“the medium is the message”), others (e.g., [ 2 ]) claiming that characteristics of a medium affect the message, and others claiming that the medium and message are separable (e.g.,[ 3 , 4 ]). As psychologists, we ask: What mental mechanisms underlie effective communication and how can presenters leverage these mechanisms to communicate better? These questions—at the intersection of psychology and communication practice—motivate this research.

That said, the relative efficacy of different communication media or technologies informs the primary questions of interest. If we can demonstrate that oral presentations are less or more effective than those that rely on presentation software—or that presenters who use one type of presentation software tend to be more effective than those who use another—then we advance our psychological and practical understanding of effective communication. Thus, in the tradition of use-inspired basic research [ 5 ]—and as a means to an end, rather than an end unto itself—we compare the effectiveness of three commonly-used formats for communication: oral, PowerPoint, and Prezi presentations.

We focused on presentations because they populate our academic, professional, and even personal lives in the form of public speeches, academic lectures, webinars, class presentations, wedding toasts, courtroom arguments, sermons, product demonstrations, and business presentations [ 6 – 8 ], and because basic questions remain about how to present effectively. Should we present with or without presentation software? If we should present with software, which software? We examined PowerPoint and Prezi because they are popular and psychologically interesting alternatives: Whereas PowerPoint’s linear slide format might reduce cognitive load, focus attention, and promote logical analysis, Prezi’s map-like canvas format and heavy reliance on animation (see the Background section and https://prezi.com for examples) might facilitate visuospatial processing, conceptual understanding, and narrative storytelling.

To inform the present research, we explore the methodological challenges of media research and review past research on presentation formats.

Methodological challenges of media research

To research the efficacy of different communication formats fairly and accurately, one must overcome two stubborn methodological challenges. First, because correlation is not causation and the variables that underlie media usage are heavily confounded, such research requires true experimentation. To study whether a blended learning “flipped classroom” is a more effective instructional medium than traditional lecturing, for example, researchers gain little insight by comparing outcomes for students who enroll in one type of course versus the other. To control for audience (in this case, student) self-selection effects, researchers need to 1) randomly assign audience members to different communication conditions (in this case, pedagogies) or 2) manipulate format within participants. Moreover, the same methodological controls need to be applied to presenters (in this case, instructors). Instructors who choose to teach with emerging, innovative methods probably differ in numerous other respects (e.g., motivation) from those who teach with more traditional methods. If students assigned randomly to a flipped classroom format perform better than those assigned randomly to a traditional classroom format, we risk drawing inferences about confounds instead of causes unless instructors are also assigned randomly to instructional media. To make strong, accurate inferences, therefore, researchers interested in communication must control for audience and presenter self-selection effects. Such control introduces new complexities; when randomly assigning presenters to formats, for example, one must ensure that all presenters receive sufficient training in the relevant format. Moreover, such control is often cumbersome, sometimes impractical, and occasionally unethical (e.g., randomly assigning students in actual courses to hypothetically worse instructional conditions). But there are no adequate methodological substitutes for proper experimental control.

A second thorny methodological challenge inherent in conducting media research concerns how to draw general inferences about formats instead of specific inferences about exemplars of those formats. For example, if one advertising expert is assigned randomly to design a print ad and another expert a television ad—and a hundred consumers are assigned randomly to view the television or print ad—can we actually infer anything about print versus television ads in general when the two groups of consumers behave differently? Arguably not, because such a finding is just as easily explained by other (confounding) differences between the ads or their creators (e.g., ratio of print to graphics, which sorts of people—if any—are shown, and so forth). In other words, even with proper random assignment, researchers who intend to study different forms of communication risk merely studying different instances of communication. Statistically speaking, one should assume a random not fixed effect of the communication objects of interest (e.g., presentations, lectures, advertisements). To overcome this challenge and draw generalizable inferences, one must (at the very least) sample a sufficiently large set of examples within each medium.

Research on presentation software

Methodological shortcomings.

Considerable research has been conducted on how different presentation formats (particularly PowerPoint) convey information (for review, see [ 9 ]). However, much of this research is anecdotal or based on case studies. For example, Tufte [ 10 ] claims that PowerPoint’s default settings lead presenters to create bulleted lists and vacuous graphs that abbreviate arguments and fragment thought. And Kjeldsen [ 11 ] used Al Gore’s TED talk on climate change as a positive example of how visuals can be used to effectively convey evidence and enhance verbal communication.

Research that goes beyond mere anecdote or case study is plagued by the aforementioned methodological shortcomings: failure to control for audience self-selection effects (71% of studies), failure to control for presenter self-selection effects (100% of studies), and a problematic assumption of fixed effects across content and presenters (91% of studies). As is evident in Table 1 , no studies overcame two of these shortcomings, let alone all three. For example, in one of the most heavily-cited publications on this topic Szabo and Hasting [ 12 ] investigated the efficacy of PowerPoint in undergraduate education. In the first study, they examined whether students who received lectures with PowerPoint performed better on a test than students who received traditional lectures. Students were not assigned randomly to lecture conditions, however; rather, the comparison was across time, between two cohorts of students enrolled in different iterations of the same course. Any observed outcome difference could have been caused by student or instructor variables (e.g., preparedness), not lecture format. The fact that no such differences were found does not obviate this concern: Such differences may in fact have been present, but were overshadowed by confounding characteristics of students or instructors. In the second study, the authors varied presentation format within the same cohort of students, but confounded format with order, time, content, and performance measure: student performance was compared between lectures on different days, on different topics, and using different tests. As the authors themselves note, the observed differences may have had nothing to do with PowerPoint. In the third study, they counterbalanced lecture order and content; some students received a PowerPoint lecture first and others a traditional lecture first, and the same topics were presented in both formats. However, students were assigned to conditions based on their course enrollment, not randomly, but more importantly the study included only four presentations, all by one presenter. Any advantages of the two PowerPoint lectures (none were found) might have been particular to those instances or that presenter and not representative of the format more generally.

Table 1. https://doi.org/10.1371/journal.pone.0178774.t001

Most studies—even those that control experimentally for audience self-selection—relied on only a single self-selected presenter, and some relied on only one presentation per format. In one study ([ 13 ]: Experiment 1), for example, one of the authors varied the format of his lecture instruction randomly across the semester, using transparencies or PowerPoint slides. In another study [ 14 ], students who were enrolled in one of the authors’ courses were assigned randomly to a PowerPoint or Prezi e-lecture that contained identical audio narration and written text. In a third study [ 15 ], one of the researchers gave the same lecture over the course of the year to rotating medical students, using PowerPoint on odd months and overhead slides on even months. What reason is there to think that we can make general claims about presentation format based on studies of single lectures or single presenters? That is, how can we reasonably assume fixed as opposed to random effects? If the use of presentation software does meaningfully influence student learning or experience, surely that effect is not constant across all presenters or presentations—some instructors use it more effectively than others, and within any format some presentations are more effective than others (see [ 16 ]). And how can we assume that presenters who select both the content and format of their presentations are not designing them in ways that favor one format over another?

Research on the efficacy of presentation software has numerous other flaws, most notably the failure to control for experimenter effects or demand characteristics. In 82% of studies we identified, for example, the researchers investigated their own instruction and studied their own students. It is difficult to imagine that one would make these instructional and research efforts (e.g., creating new course material, conducting a field experiment) without a strong belief in the efficacy of one format over the other, and it is plausible (if not likely) that such beliefs would influence students or confound instructional format with instructional effort and enthusiasm.

Another common issue is the confounding of lecture format with access to study materials—in studies that contrast PowerPoint with traditional lecturing (e.g., [ 17 – 19 ]), students in the PowerPoint condition (but not the control condition) sometimes have access to PowerPoint slides as study material. This access could bias student motivation, behavior (e.g., attendance), course satisfaction, and performance (see [ 20 ]).

PowerPoint: Performance, perception, and persuasion.

Despite their methodological shortcomings, what are the findings of this research literature? The majority of studies examined the use of PowerPoint in higher education and measured both objective and subjective outcomes (see Table 1 ). They typically involved students enrolled in one or more of the researchers’ courses, and contrasted the efficacy of lectures (or whole lecture courses) that used PowerPoint with those that used a more traditional technology (e.g., blackboards, overhead projectors). In terms of student performance, their findings were notably mixed: Of the 28 studies we identified, 17 found no effect of PowerPoint lectures relative to traditional lectures ([ 12 ]: Experiments 1,3; [ 13 , 15 , 21 – 33 ]), 9 found a performance benefit of PowerPoint over traditional instruction ([ 12 ]: Experiment 2; [ 17 – 19 , 34 – 38 ]), and 2 found a performance benefit of traditional over PowerPoint instruction [ 39 , 40 ].

There is near consensus in the literature, however, when it comes to student perception: Of the 26 studies we identified, 21 found that students preferred PowerPoint over traditional instruction ([ 12 ]: Experiment 1; [ 13 , 17 – 19 , 21 , 23 , 25 , 26 , 28 , 29 , 31 – 33 , 35 , 39 , 41 – 45 ]), 2 found that students preferred traditional over PowerPoint instruction [ 40 , 46 ], and 3 other studies found no preference for one or the other formats [ 15 , 22 , 37 ]. As one example, Tang and Austin [ 45 ] surveyed 215 undergraduates in business courses about their general perceptions of different lecture formats; on measures of enjoyment, learning, motivation, and career relevance, they found that students rated lectures with PowerPoint slides more favorably than lectures with overheads or without visual aids. An additional 7 studies did not contrast student perceptions of PowerPoint with another technology—they simply surveyed students about PowerPoint; these studies all found that students had, on average, favorable impressions of PowerPoint-based instruction [ 36 , 47 – 52 ].

In addition to these studies of how presentation software impacts student performance and perception, two studies examined PowerPoint‘s impact on audience persuasion. Guadagno, Sundie, Hardison, and Cialdini [ 53 ] argue that we heuristically use a presentation’s format to evaluate its content, particularly when we lack the expertise to evaluate the content on its merits. To test this hypothesis, they presented undergraduates with key statistics about a university football recruit and asked them to evaluate the recruit’s career prospects. The same statistics were presented in one of three formats: a written summary, a graphical summary via printed-out PowerPoint slides, or a graphical summary via animated PowerPoint slides (self-advanced by the participant). Participants shown the computer-based PowerPoint presentation tended to rate the recruit more positively than other participants, and there was some evidence that this effect was more pronounced for football novices than for experts. The findings of this study suggest that some presentation formats may be more persuasive than others, perhaps because audience members conflate a sophisticated medium with a sophisticated message.

In the second study to examine the impact of PowerPoint on persuasion, Park and Feigenson [ 54 ] examined the impact of video-recorded presentations on mock juror decision-making. Participants were more persuaded by attorneys on either side of a liability case when the attorney used PowerPoint slides as opposed to merely oral argument. They also remembered more details from PowerPoint than oral presentations, and evaluated both attorneys as more persuasive, competent, credible, and prepared when they presented with PowerPoint. Based on mediation analyses, the researchers argue that the decision-making benefit of PowerPoint results from both deliberative and heuristic processing (“slow” and “fast” thinking, respectively, see [ 55 ]).

Both of these studies, however, share the methodological limitations of the educational research on PowerPoint. The first study [ 53 ] used only one PowerPoint presentation, and the second [ 54 ] used only two. The presentations used were not selected at random from a larger stimulus pool but instead were created by researchers who hypothesized that PowerPoint would enhance presentations. But even if the presentations had been sampled randomly, the sample is too small to allow one to generalize to a broader population. In studying performance, perception, or persuasion, one cannot reasonably assume that all presentation effects are equal.

Prezi: A zoomable user interface.

Released in 2009, Prezi has received generally favorable reviews by researchers, educators, and professional critics [ 56 – 60 ]. With a purported 75 million users worldwide, it is increasingly popular but still an order of magnitude less so than PowerPoint (with as many as one billion users; [ 61 ]). Like PowerPoint and other slideware, Prezi allows users to arrange images, graphics, text, audio, video and animations, and to present them alongside aural narration to an in-person or remote audience. In contrast to PowerPoint and other slideware in which users create presentations as a deck of slides, Prezi users create presentations on a single visuospatial canvas. In this regard, Prezi is much like a blackboard and chalk. But unlike a physical blackboard, the Prezi canvas is infinite (cf. [ 62 ]) and zoomable: in designing presentations, users can infinitely expand the size of their canvas and can zoom in or out. When presenting, users define paths to navigate their audience through the map-like presentation, zooming and panning from a fixed-angle overhead view.

Like Google Maps or modern touchscreens, Prezi is an example of what scholars of human-computer interaction label a zoomable user interface (ZUI). These interfaces are defined by two features: They present information in a theoretically infinite two-dimensional space (i.e., an infinite canvas) and they enable users to animate this virtual space through panning and zooming. Some of the original ZUIs were used to visualize history, navigate file systems, browse images, and—in the Prezi predecessor CounterPoint—create presentations [ 63 , 64 ].

As communication and visualization tools, ZUIs in general and Prezi in particular are interesting psychologically for several reasons. First, they may take advantage of our mental and neural architecture, specifically the fact that we process information through dissociable visual and spatial systems. Whereas the so-called “ventral” visual system in the brain processes information such as shape and color, the “dorsal” spatial system processes information such as location and distance [ 65 – 68 ]. When working in concert, these systems result in vastly better memory and comprehension than when they work in isolation. For example, in the classic “method of loci” individuals visualize objects in specific locations; when later trying to recall the objects, they visualize navigating through the space, “seeing” each object in turn. This method typically doubles retention, compared to other ways of trying to memorize objects [ 69 , 70 ]. Similarly, in research on note-taking, students learned more when they used spatial methods than when they used linear methods (e.g., [ 71 ]). Mayer’s multimedia learning principles and evidence in their favor also highlight the importance of spatial contiguity [ 72 ].

Thus, by encouraging users to visualize and process information spatially, ZUIs such as Prezi may confer an advantage over traditional tools such as PowerPoint that do not encourage such visuospatial integration. As Good and Bederson [ 64 ] write: “Because they employ a metaphor based on physical space and navigation, ZUIs offer an additional avenue for exploring the utilization of human spatial abilities during a presentation.”

Furthermore, ZUIs may encourage a particularly efficacious type of spatial processing, namely graphical processing. In graphical processing, digital objects (or groups of objects) are not just arranged in space; they are arranged or connected in a way that makes their interrelationships explicit. Randomly placing animal stickers on a blank page, for example, engages mere spatial processing; drawing connecting lines between animals of the same genus or arranging the animals into a phylogenetic tree, however, engages graphical processing. Because ZUIs force users to “see the big picture,” they may prompt deeper processing than software that segments content into separate spatial canvases. By facilitating such processing, ZUIs may leverage the same learning benefits of concept maps and other graphical organizers, which have been studied extensively. For example, in their meta-analysis of the use of concept maps in education, Nesbit and Adesope [ 73 ] found that these graphical representations (especially when animated) were more effective than texts, lists, and outlines. By requiring one to organize the whole presentation on a single canvas instead of a slide deck, therefore, Prezi may prompt presenters (and their audiences) to connect component ideas with each other, contextualize them in a larger narrative, and remember, understand, and appreciate this larger narrative. Slideware, on the other hand, may do just the opposite:

PowerPoint favours information that can be displayed on a single projected 4:3 rectangle. Knowledge that requires more space is disadvantaged … How to include a story on a slide? Distributing the associated text over several slides literally breaks it into fragments, disturbing its natural cohesion and thus coherence … PowerPoint renders obsolete some complex narrative and data forms in favour of those that are easily abbreviated or otherwise lend themselves to display on a series of slides [ 74 ] (p399)

Of course these arguments are speculative, and one can also speculate on the psychological costs of ZUIs or benefits of standard slideware. Perhaps PowerPoint does confer some of the same spatial processing benefits of Prezi—after all, slides are spatial canvases, and they must be arranged to form a narrative—but in a way that better manages the limited attentional resources of the presenter or audience. Our point here is simply that Prezi, as a ZUI presentation tool, offers a psychologically interesting alternative to standard deck-based slideware, with a range of possible advantages that could be explored empirically to discover the psychological mechanisms of effective communication.

Like the PowerPoint literature, most of the published literature on Prezi is limited to observational reports or case studies. Brock and Brodahl [ 75 ] evaluated Prezi favorably based on their review and students’ ratings of course presentations. Conboy, Fletcher, Russell, and Wilson [ 76 ] interviewed 6 undergraduates and 3 staff members about their experiences with Prezi in lecture instruction and reported generally positive experiences. Masood and Othman [ 77 ] measured the eye movements and subjective judgments of ten participants who viewed a single Prezi presentation; participants attended to the presentation’s text more than to its other components (e.g., images, headings), and favorably judged the presentation. Ballentine [ 78 ] assigned students to use Prezi to design text adventure games and reported benefits of using the medium. Two other studies [ 79 , 80 ] surveyed college students about their course experiences with Prezi, and both reported similarly positive perceptions.

All of these studies, however, suffer from major demand characteristics, due to the fact that the researchers observed or asked leading questions of their own students about their own instruction (e.g., “Do you find lectures delivered with Prezi more engaging then[sic] other lectures?”, from [ 79 ]). Moreover, all suffer from the methodological limitations discussed earlier.

Other literature that addresses Prezi is purely theoretical and speculative: In discussing the pedagogical implications of various presentation software, Harris [ 81 ] mostly just describes Prezi’s features, but does suggest that some of these features provide useful visual metaphors (e.g., zooming in to demonstrate otherwise hidden realities). Bean [ 82 ] offers a particularly compelling analysis of PowerPoint and Prezi’s histories, user interfaces, and visual metaphors, and argues that Prezi is the optimal tool for presenting certain types of information (e.g., wireflow diagrams).

The experimental literature on Prezi is limited to three published studies. Castelyn, Mottart and Valcke [ 14 ] investigated whether a Prezi e-lecture with graphic organizers (e.g., concept maps) was more effective than a PowerPoint e-lecture without graphic organizers. Claiming that Prezi encourages the use of graphic organizers, they purposefully confounded the type of presentation software with the presence of graphic organizers. Undergraduates randomly assigned to the different e-lectures did not differ in their knowledge or self-efficacy gains, but did prefer the graphically-organized Prezi lecture over the PowerPoint control lecture. In a follow-up study [ 83 ], the same researchers assigned undergraduates to create Prezi presentations that did or did not use graphic organizers, and found no effects of this manipulation on students’ self-reported motivation or self-efficacy. Chou, Chang, and Lu [ 24 ] compared the effects of Prezi, PowerPoint and traditional blackboard instruction on 5th graders’ learning of geography. Whereas the Prezi group performed better than the control group (which received blackboard instruction) in formative quizzes and a summative test, the PowerPoint group did not; however, on a delayed summative test, both Prezi and PowerPoint students performed better than those in the control group. In direct comparisons of PowerPoint and Prezi, there were no differences in any of the learning measures. Taken together, the studies are not just limited in number: They present uncompelling findings and suffer from the same methodological shortcomings of the PowerPoint research.

The current study

In short, the extant literature does not clarify whether presenters should present with or without visual aids—and, if they do use visual aids, whether they should use standard deck-based slideware such as PowerPoint or a ZUI such as Prezi. One of the reasons why these basic questions remain unanswered is the methodological challenges inherent in comparing different presentation formats. We designed the current study to overcome these challenges.

To control for individual differences among presenters, we randomly assigned presenters to different presentation conditions. To control for individual differences among audience members, we used a counterbalanced, within-participants design for the first experiment, and between-participants random assignment in the second experiment. And to draw general inferences about the impact of presentation format—instead of specific inferences about particular presenters or presentations—we sampled from a large number of presentations, each created by a different presenter. Our methods have their own challenges, such as recruiting participants sufficiently trained in all presentation methods, allowing presenters adequate preparation time and context, approximating the psychological conditions of real-world presentations, and measuring the “signal” of presentation format among the added “noise” of so many presenters and presentations. In addition, the studies had to be double-blind: Neither presenters nor audience members could be aware of any hypotheses, and had to be free from any sorts of confirmation bias conveyed by the investigators.

To focus on presentations as a form of presenter-audience communication and limit the number of confounded variables, we purposefully controlled for other possible impacts of presentation software on professional practices or outcomes, including 1) the use of presentation artifacts (e.g., PowerPoint files, printed-out slides, online Prezis), and 2) facilitated collaboration among presentation designers. Unlike other research (e.g., [ 32 , 33 ]) we did allow for the possibility that presentation format not only affects how audiences perceive presentations, but also how presenters design or deliver them (e.g., by increasing their conceptual understanding of the topic, or decreasing their cognitive load during live narration; cf. [ 84 ]). In other words, presentation technologies might affect the cognition of both the audience and the presenter, so we designed the present studies to accommodate both sets of mechanisms.

To maximize the real-world relevance of this research, we relied on multimedia case materials from Harvard Business School [85]; these materials recreate the actual professional circumstances in which presentations are typically used. Because presentations are commonly designed both to inform and to persuade audiences, we examined outcome measures of learning as well as persuasion. And to minimize demand characteristics, we avoided the typical flaws of existing research (e.g., researcher-designed presentations, the researchers’ students as research participants) and adopted several countermeasures (e.g., recruitment language and participant instructions that obscured the research hypotheses, between-participant manipulation).

We adopted a two-phased approach in this research. In the first phase, participants with sufficient experience in oral, PowerPoint, and Prezi presentation formats were randomly assigned to create a presentation in one of those formats. We provided the necessary context, instruction, and time to create a short but realistic presentation. Participants then presented live to an actual audience, who judged each presentation’s efficacy. In the second phase, recorded versions of these presentations were presented to a larger online audience, affording us greater statistical power and allowing us to measure the impact of presentation format on decision-making and learning.

Experiment 1

Participants.

We recruited presenter participants via online postings (on Craigslist, the Harvard Psychology Study Pool, the Harvard Decision Science Lab Study Pool), email solicitations to the local Prezi community, and campus flyers. To create the fairest comparison between PowerPoint and Prezi, we recruited individuals who “have expertise in using both PowerPoint and Prezi presentation software.” Interested individuals were directed to a prescreening survey in which they reported their experience with and preference for giving different types of presentations. Only individuals who reported that they were “not at all experienced” with PowerPoint, Prezi, or giving oral presentations were excluded from research participation. Of the 681 respondents who completed the prescreening survey, 456 were eligible and invited to sign up for an available timeslot. Of this group, 146 individuals—105 from the Harvard study pools, 33 from Craigslist, and 8 from the Prezi community—participated as presenters in the study and were compensated $40 for approximately two hours of their time. There were no significant differences between the three presentation groups on any demographic variables.

We also recruited 153 audience participants from the Harvard Decision Science Lab Study Pool and Craigslist using the following announcement:

Do you use Skype? Does your computer have a large screen (13 inches or larger)? If so, you may be eligible to participate in a 45 minute long online study. In this study, you will watch professional presentations over Skype from home on your personal computer.

Anyone who responded to the recruitment notice was eligible, provided that they were available during one of the prescheduled testing sessions. Audience participants were compensated $10 for approximately 45 minutes of their time. Table 2 presents demographic information for the presenter and audience participants. This study was approved by the Harvard Committee on the Use of Human Subjects (Study #IRB14-1427), and all participants in both experiments provided written consent.

[Table 2] https://doi.org/10.1371/journal.pone.0178774.t002

Presenter procedure.

Presenter participants completed a survey remotely before attending the in-person, group sessions with other participants. In the online pre-survey, presenters first answered basic demographic questions (gender, age, education level, English fluency, and occupation). Next, they answered questions about their prior experience with, opinions about, and understanding of the different presentation formats (oral, Prezi, and PowerPoint). This section was prefaced with the following note:

A note on language: When we use the term "presentation," we mean a formal, planned, and oral presentation of any duration, including a public speech, an academic lecture, a webinar, a class presentation, a wedding toast, a sermon, a product demonstration, a business presentation, and so on. Examples of things we do NOT mean are: a theatrical performance, an impromptu toast at dinner, and any presentation with no audience. When we say PowerPoint presentations, we mean presentations that were made using Microsoft PowerPoint, not other software such as Apple's Keynote. When we say Prezi presentations, we mean presentations that were made using Prezi presentation software. Also, when we refer to "oral presentation", we mean a presentation that is only spoken and does not include any visual aids or the use of presentation software.

Participants were asked the following questions for each type of presentation:

  • How experienced are you at making the following types of presentations? [5-level rating]
  • When you give a presentation, how effective are the following types of presentations for you? [5-level rating, with “not applicable” option]
  • When somebody else gives a presentation, how effective are the following types of presentations for you? [5-level rating, with “not applicable” option]
  • How difficult is it for you to make the following types of presentations? [5-level rating, with “not applicable” option]
  • In the last year, approximately how many of the following types of presentations did you make? [free response]
  • In your lifetime, approximately how many of the following types of presentations have you made? [free response]
  • For approximately how many years have you been making the following types of presentations? [free response]

As part of the expertise-related measures, we also asked the participants to identify the purported advantages and disadvantages of each presentation format, according to its proponents and critics, respectively. For PowerPoint and Prezi, we asked participants to identify whether or not it had particular functionalities (e.g., the capacity to record narration, create custom backgrounds, print handouts). Finally, participants viewed three sets of four short Prezi presentations and rank-ordered them from best to worst. In each set we manipulated a key dimension of Prezi effectiveness, according to its designers: the use of zooming, the connection of ideas, and the use of visual metaphor.

Presenter participants were tested in person at the Harvard Decision Science Lab and randomly assigned to one of three groups: Prezi, PowerPoint, or oral presentation. A total of 50 data collection sessions were held. Each session typically included three presenter participants (one for each presentation format); because of no-shows and overbooking, however, ten sessions had only two presenters and six sessions had four presenters.

After providing informed consent, participants completed an online survey (in the lab) in which they rank-ordered three sets of recorded example PowerPoint and oral presentations. Identical in form to the example Prezi presentations they judged in the pre-survey, these short presentations were designed to assess their understanding of effective presentation design by manipulating a key aspect specific to each format. For PowerPoint presentations, we manipulated the use of text, use of extraneous “bells and whistles,” and graph design; for oral presentations, the three dimensions were verbal behavior, nonverbal behavior (other than eye contact), and eye contact. In selecting these dimensions (and those for Prezi), we consulted with a variety of experts, including software designers, speaking coaches, and researchers.

Next, presenters were shown material from a multimedia case created for and used by the Harvard Business School. Specifically, they were told the following (the company featured in the business case will be referred to anonymously here as “Company X” to respect their contractual agreement with the school):

For the next two hours, you are going to pretend to be the chief marketing officer of i-Mart, a large chain of retail stores. i-Mart recently made an offer to [Company X] to sell their products in i-Mart stores. Your boss, the CEO of i-Mart, has asked you to make a presentation to [Company X]’s leadership that persuades them to accept i-Mart’s offer. In your presentation, you will need to argue that accepting i-Mart’s offer is in [Company X]’s strategic interests, and address any concerns they may have about how accepting the offer might affect their corporate identity.
As a participant in this study, your primary job today is to prepare and then deliver this presentation. The presentation will be very short (less than 5 minutes) and made live (via Skype) to an audience of participants who are playing the part of [Company X] executives. Before you start planning your presentation, you will first learn more about [Company X] and how they’re thinking about i-Mart’s offer.

On their own computer workstation, participants studied the multimedia case for 30 minutes and were invited to take notes on blank paper provided for them. The multimedia case material included video and textual descriptions of Company X’s corporate culture, business model, and constituent communities.

Following this study period, participants were given 45 minutes to create a presentation in one of three randomly assigned presentation formats: PowerPoint, Prezi, or oral. To assist participants in the PowerPoint and Prezi conditions, we provided them with a set of digital artifacts including text, data, and graphics related to the case. Participants were not told that other participants were asked to present in different formats, and the workstations were separated from each other to prevent participants from discovering this manipulation.

After this preparation period, participants were taken individually (in a counterbalanced order) to another room to present to a live audience via Skype. For PowerPoint and Prezi presentations, we shared each participant’s presentation with the audience via screen sharing; thus the audience viewed both the presenter and the presentation. For those presenters who consented, we also recorded their presentations for future research purposes. After making their presentations, presenters completed a final survey about their presentation (e.g., “How convincing do you think your presentation will be to [Company X’s] board members?”), the corporate scenario (e.g., “What do you think [Company X] should do?”), and their presentation format (e.g., “How likely are you to recommend the presentation tool or presentation format you used to others to make professional presentations?”).

Audience procedure.

Audience participants completed the entire experiment remotely and online. Their participation was scheduled for the end of the presenter sessions so that the in-lab presenters could present live to a remote audience via Skype. We recruited between three and six audience participants per session, although no-shows and Skype connectivity issues left some sessions with only one or two audience participants: five sessions had one participant, twelve sessions had two participants, sixteen sessions had three participants, eleven sessions had four participants, four sessions had five participants, and two sessions had six participants.

Individuals who responded to the recruitment notice completed a consent form and three online surveys prior to their scheduled Skype session. The first survey was a slightly modified form of the presenter pre-survey (demographics, background on presentation formats, rank-ordering of example Prezis) in which they also scheduled their Skype session. In the second survey, audience participants were told that they were “going to play the role of a corporate executive listening to several short business presentations,” and that their task was “to evaluate the quality of these presentations, each made by another participant engaged in a similar role-playing scenario.” They were then shown a brief video and textual description of the fictionalized corporate scenario (an abridged version of what presenter participants studied), and told the following:

You are a board member for [Company X], an innovative clothing company. Another company, i-Mart, wants to sell [Company X’s products] in its stores. You and your fellow board members must decide whether or not to accept i-Mart's offer.

And in the third survey they rank-ordered the three sets of recorded example PowerPoint and oral presentations.

At the time of the scheduled session, the audience participants logged into Skype using a generic account provided by the research team, and were instructed to turn on their webcams and put on headphones. Once the first presenter participant was ready to present, the experimenter initiated the group Skype call, confirmed that the software was functioning properly, invited the presenter into the room to begin, left the room before the start of the presentation, monitored the presentation remotely via a closed-circuit video feed, and re-entered the room at the presentation’s conclusion. For Prezi and PowerPoint presentations, Skype’s built-in screen-sharing function was used to share the visual component of the presentation; audience participants viewing these presentations were instructed to use the split-screen view, with windows of equal size showing the presenter and the accompanying visuals.

Immediately after viewing each presentation, participants evaluated it via an online survey. They rated each presentation on how organized, engaging, realistic, persuasive, and effective it was using a five-level scale with response options of not at all, slightly, somewhat, very, and extremely. They were also invited to offer feedback to the presenter on how the presentation could be improved. After the final presentation, participants rank-ordered the presentations on the same dimensions (e.g., effectiveness, persuasiveness). Halfway through the experiment we added a final question in which we asked participants to rank-order PowerPoint, Prezi, and oral presentation formats “in terms of their general effectiveness, ignoring how well individual presenters (including today's) use that format,” and to explain their rank-ordering.

Prior experience and pre-existing beliefs.

Participants’ prior experience with and pre-existing beliefs about each presentation format provide a baseline that informs the research findings. If presenter participants had more experience with and more positive beliefs about one format than the others—and those assigned to that format induced more positive assessments from the audience members than did those assigned to the other formats—then the results are less compelling than if there was no correlation between these baseline measures and the experimental outcomes. The same applies to audience participants: Are they merely judging presentations according to their initial biases? Conversely, the results are most compelling if there is a negative association between the baseline measures and the experimental findings. For this reason—and to check that presenters assigned to the different formats did not happen to differ in these baseline measures—we analyzed participants’ prior experience with and pre-existing beliefs about PowerPoint, Prezi, and oral presentation formats.

Both audience and presenter participants were least experienced with Prezi and most experienced with oral presentations. At the outset, they rated PowerPoint as the most effective and easiest format for presenting material, and Prezi as the least effective and most difficult. For watching presentations, audience participants rated PowerPoint as most effective and oral presentations as least effective, but rated Prezi as more enjoyable than the other formats. For watching presentations, presenter participants did not find any format more effective than the others. Table 3 presents full descriptive and inferential statistics for all self-reported measures of prior experience with and preexisting beliefs about Prezi, PowerPoint, and oral presentations.

[Table 3] https://doi.org/10.1371/journal.pone.0178774.t003

Presenters assigned to different formats did not differ in their experience with or pre-existing beliefs about presentation formats. They also did not differ in how well they identified the purported advantages and disadvantages of each presentation format, how well they identified the software features of PowerPoint and Prezi, or how accurately they could identify effective presentations of each format.

Audience ratings.

In terms of their prior experience with and pre-existing beliefs about presentation formats, both audience and presenter participants were biased in favor of oral and PowerPoint presentations and against Prezi. After presenters were randomly assigned to these different formats, how did the audience evaluate their presentations?

In examining how presentation format affected the audience’s ratings of the presentations, two complications arose. First, sessions with two presentations were missing one presentation format, and sessions with four presentations had two presentations of the same format. To address this complication, we conducted only pairwise comparisons of different formats (e.g., PowerPoint versus oral) instead of omnibus tests, and—for those sessions with four presentations—we averaged ratings for the two same-format presentations. To be certain that the differing number of presentations per session did not somehow bias the results even after adopting these measures, we also conducted an analysis on the subset of sessions that had exactly three presentations.

Second, the number of audience participants per session ranged from one to six. In calculating descriptive statistics, some sessions would be weighted more heavily than others unless ratings were first averaged across participants within the same session, then averaged across sessions. In calculating inferential statistics, averaging across ratings from different participants within the same session who received presentations in the same format was necessary to ensure that the sampling units were independent of each other, an assumption of all parametric and most nonparametric tests. In other words, for both descriptive and inferential statistics, we treated session (instead of participant) as the sampling unit.

As an empirical matter, this multi-step averaging—within participants across identical presentation formats, then across participants within the same session—had little impact on the condition means (i.e., the average ratings of PowerPoint, Prezi, or oral presentations on each dimension). Compared to the simplest, raw averaging of all ratings in one step, the maximum absolute difference between these two sets of means was .07 (on a 1–5 scale) and the mean absolute difference was .04.
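
As a concrete illustration of this aggregation, the following sketch (a non-authoritative reconstruction, not the authors' analysis code) applies the two-step averaging to a small made-up long-format table; the column names session, participant, format, and rating are assumed for illustration.

    import pandas as pd

    # Hypothetical long-format data: one row per audience member's rating of one presentation.
    ratings = pd.DataFrame({
        "session":     [1, 1, 1, 1, 1, 1, 2, 2, 2, 2],
        "participant": ["a", "a", "a", "b", "b", "b", "c", "c", "d", "d"],
        "format":      ["prezi", "ppt", "oral"] * 2 + ["prezi", "ppt", "prezi", "ppt"],
        "rating":      [4, 3, 3, 5, 3, 4, 4, 2, 5, 3],
    })

    # Step 1: average within each participant across presentations of the same format
    # (relevant for sessions that contained two presentations of one format).
    per_participant = (ratings
                       .groupby(["session", "participant", "format"], as_index=False)["rating"]
                       .mean())

    # Step 2: average across participants within the same session, so that the session
    # (not the participant) is the sampling unit.
    session_means = (per_participant
                     .groupby(["session", "format"], as_index=False)["rating"]
                     .mean())
    print(session_means)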

To test whether the presentations’ format affected their ratings, therefore, we conducted paired t-tests for each rating dimension, with presentation format as the repeated measure and mean session rating as the dependent variable. Because we conducted three tests for each dimension—pairing each format with every other—we controlled for multiple comparisons by dividing our significance threshold by the same factor (i.e., α = .05/3 = .017). Results revealed that presentation format influenced audience ratings. In particular, the audience rated Prezi presentations as significantly more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations; on a five-level scale, the average participant rated Prezi presentations over half a level higher than other presentations. The audience did not rate PowerPoint presentations differently than oral presentations on any dimension. Table 4 and Fig 1 present these results.
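
A minimal sketch of this pairwise testing procedure, assuming the session-level means have been pivoted into a wide table with one column of mean ratings per format (the table and column names are illustrative, and this is not the authors' code):

    from itertools import combinations
    from scipy import stats

    ALPHA = 0.05 / 3  # three pairwise comparisons per rating dimension

    def paired_format_tests(wide):
        # wide: one row per session, one column of mean ratings per format, e.g. built with
        # session_means.pivot(index="session", columns="format", values="rating")
        for f1, f2 in combinations(wide.columns, 2):
            pair = wide[[f1, f2]].dropna()  # keep sessions that included both formats
            t, p = stats.ttest_rel(pair[f1], pair[f2])
            print(f"{f1} vs {f2}: t = {t:.2f}, p = {p:.4f}, "
                  f"significant at corrected alpha: {p < ALPHA}")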

[Table 4] https://doi.org/10.1371/journal.pone.0178774.t004

[Fig 1] Audience members rated presentations on each dimension on a 5-level scale (1 = “not at all,” 5 = “extremely”). The figure shows session-level means from all available data, including those from sessions with two or four presentations. https://doi.org/10.1371/journal.pone.0178774.g001

By limiting the analysis to the 34 sessions with exactly three presentations (one of each format), we could ensure that the sessions with two or four presentations did not somehow bias the results. Moreover, this procedure enabled us to conduct omnibus tests of presentation format for each rating dimension. These omnibus tests revealed significant effects for organization, F(2,66) = 12.9, p < .0001, engagement, F(2,66) = 4.6, p = .01, persuasion, F(2,66) = 3.9, p = .03, and effectiveness, F(2,66) = 7.2, p = .001. The results from post-hoc tests (Fisher’s LSD) aligned with the original pairwise comparisons: on all dimensions, the audience rated Prezi presentations higher than PowerPoint and oral presentations, ps < .05; PowerPoint and oral presentations were not rated differently on any dimension, ps > .05. (Note: all p-values for pairwise tests here and elsewhere are two-tailed.)

To explore whether the obtained results were somehow the result of demand characteristics, we analyzed ratings from only the first presentation in each session. This analysis yielded the same pattern of findings, with an expected reduction in statistical significance due to the loss of power. On all four dimensions, a one-way, independent-measures ANOVA yielded significant or marginally significant results: organized, F(2,49) = 5.1, p = .01; engaging, F(2,49) = 2.5, p = .09; persuasive, F(2,49) = 2.6, p = .09; and effective, F(2,49) = 5.8, p = .006. In all cases, Prezi was rated higher than oral and PowerPoint presentations (post-hoc LSD ps ≤ .08).

On average, the audience rated the presentations as realistic, with a modal rating of “very realistic.” Our intent in including this rating dimension was merely to verify that our experimental protocol resulted in realistic rather than contrived presentations; we therefore did not test for differences in these ratings as a function of group differences.

Audience rankings.

As just noted, participants randomly assigned to present using Prezi were rated as giving more organized, engaging, persuasive, and effective presentations compared to those randomly assigned to the PowerPoint or oral presentation conditions. In addition, at the end of each session audience participants rank-ordered each type of presentation on the same dimensions used for the ratings. Here we ask: Did the audiences’ rank-orderings align with the ratings?

The same complexities with the ratings data—the variable number of conditions and audience participants per session—applied as well to the ranking data. We therefore adopted a similar analytic strategy, with one exception: we conducted non-parametric rather than parametric pairwise tests, given the rank-ordered nature of the raw data and the distributional assumptions that underlie parametric tests.

Using the session-level mean ranks, we tested the effect of presentation format with three sets of Wilcoxon signed-rank tests. The results showed the identical pattern as those from the ratings data: the audience ranked Prezi presentations as significantly more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations (all ps ≤ .006); the audience did not rank PowerPoint presentations differently than oral presentations on any dimension. Table 5 and Fig 2 present these results.
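
The ranking analysis can be sketched the same way, assuming a wide table of session-level mean ranks with one column per format (again illustrative, not the authors' code):

    from itertools import combinations
    from scipy import stats

    def pairwise_wilcoxon(rank_wide):
        # rank_wide: one row per session, one column of mean ranks per format
        # (lower values indicate better ranks).
        for f1, f2 in combinations(rank_wide.columns, 2):
            pair = rank_wide[[f1, f2]].dropna()
            stat, p = stats.wilcoxon(pair[f1], pair[f2])
            print(f"{f1} vs {f2}: W = {stat:.1f}, p = {p:.4f}")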

[Table 5] https://doi.org/10.1371/journal.pone.0178774.t005

[Fig 2] Audience members ranked the presentations from best to worst, with lower ranks indicating better presentations. The figure shows session-level means from all available data, including those from sessions with two or four presentations. https://doi.org/10.1371/journal.pone.0178774.g002

As with the ratings data, we also conducted omnibus tests of only those sessions with exactly three presentations to validate that unbalanced sessions did not somehow bias the results. These tests (Friedman ANOVAs) revealed significant effects for organization, exact p = .0005, engagement, exact p = .04, and effectiveness, exact p = .003; we found only a marginally significant effect for persuasion, exact p = .08. Post-hoc tests (Fisher’s LSD) showed that the audience ranked Prezi presentations higher than PowerPoint and oral presentations on all dimensions, ps < .05; PowerPoint and oral presentations were not ranked differently on engagement, persuasion, or effectiveness, ps > .05, but the audience did rank PowerPoint presentations as more organized than oral presentations, p = .04.
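
For readers reproducing the omnibus analysis, a Friedman test on the balanced sessions might be sketched as follows; note that SciPy returns an asymptotic chi-square p-value, whereas the paper reports exact p-values, which would require an exact or permutation procedure (column names are illustrative):

    from scipy import stats

    def friedman_by_format(balanced):
        # balanced: session-level mean ranks from the 34 sessions that contained exactly
        # one presentation of each format, with one column per format.
        stat, p = stats.friedmanchisquare(balanced["prezi"],
                                          balanced["ppt"],
                                          balanced["oral"])
        print(f"Friedman chi-square = {stat:.2f}, asymptotic p = {p:.4f}")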

Audience omnibus judgments of effectiveness.

Before and after the experimental session, audience participants judged the general effectiveness of the three presentation formats. In the pre-survey, they rated each format on its effectiveness for them as presenters and audience members. In the post-survey, they rank-ordered the formats on their “general effectiveness” and were instructed to ignore “how well individual presenters (including today's) use that format.” Although the pre- and post-questions differed in their phrasing and response formats, they nonetheless afford us an opportunity to investigate if and how their judgments changed over the course of the experiment.

As already described (see Table 3), the audience began the experiment judging PowerPoint presentations as most effective for presenters and audiences. They ended the experiment, however, with different judgments of efficacy: a majority (52%) ranked Prezi presentations as the most effective, a majority (57%) ranked oral presentations as least effective, and a plurality (49%) ranked PowerPoint presentations second in effectiveness. A Friedman ANOVA (on the mean rankings) confirmed that participants ranked the presentation formats differently, exact p = .00007. Post-hoc analysis with Wilcoxon signed-rank tests revealed that the audience ranked both Prezi and PowerPoint presentations as more effective than oral presentations, ps ≤ .003. They did not rank Prezi and PowerPoint presentations significantly differently (p = .15). Fig 3 presents these results.

[Fig 3] Note: Means shown from pre-survey items are calculated based on responses from all participants (as opposed to only those who had experience with all presentation formats). https://doi.org/10.1371/journal.pone.0178774.g003

In the pre-survey, some audience participants reported prior experience viewing Prezi presentations but others did not (i.e., those who selected the “not applicable” response option). Compared to participants with no prior experience watching Prezi presentations (n = 34), participants with prior Prezi experience (n = 117) rated PowerPoint presentations (but not oral presentations) as less effective, t(149) = 2.7, p = .007, mean difference = .47, and less enjoyable for them, t(149) = 2.9, p = .004, mean difference = .53. Thus, prior experience with Prezi was associated with negative pre-existing judgments of PowerPoint.

Audience correlates of presentation ratings and rankings.

What, if any, individual-level variables—demographics and baseline survey responses—correlated with the audience’s judgments of the presentations? If, for example, the more experience the audience had with Prezi, the worse they evaluated those presentations, such a correlation would suggest that the current findings reflect a novelty effect.

We did not find any significant relationships between the audiences’ prior experience with a given presentation format (presenter experience rating, number of years, number of presentations watched last year or in their lifetime) and their ratings or rank-orderings of that presentation format on any dimension, all |r|s < .16. The only pre-existing audience beliefs about the presentation formats (presenter effectiveness, presenter difficulty, audience effectiveness, audience enjoyableness) that correlated with their ratings or rankings concerned oral presentations: the more effective participants rated oral presentations for themselves as audience members before the experiment, the more engaging they rated and ranked oral presentations during the experiment, r = .22 and .26, respectively, ps < .01.

Among demographic variables, only age showed reliable correlations with the audiences’ evaluations of presentations: the older the participant, the more effective they rated PowerPoint presentations, r = .23, p = .007, the more persuasive they ranked PowerPoint presentations, r = .24, p = .006, and the less organized and persuasive they rated oral presentations, r = -.32, p = .001, and r = -.21, p = .01, respectively.

Audience participants’ success in distinguishing better from worse presentations of each format (i.e., their rank-ordering of short expert-created examples) did not correlate with their evaluations of the experimental presentations, nor did it correlate with the audiences’ self-reported experience with each format.

Audience free response.

Although we cannot assume that participants understood the reasons behind their rank-orderings (cf. [86]), their explanations may nonetheless offer some insight into how they perceived different presentation formats. In explaining their rank-ordering of the presentation formats in terms of their general effectiveness, 8% of participants who preferred Prezi mentioned that it was new or different or that PowerPoint presentations were old or outdated. More commonly, they described Prezi as more engaging or interactive (49%), organized (18%), visually interesting, visually compelling, visually pleasing, sleek, or vivid (15%), or creative (13%). Of participants who preferred PowerPoint, 38% described it as more concise, clear, easy to follow, familiar, professional, or organized than the other presentation formats. An equal percentage explained their choice in terms of negative judgments of Prezi, including comments that Prezi was disorienting, busy, crowded, amateurish, or overwhelming. Participants who rank-ordered oral presentations as most effective remarked that they felt more engaged or connected with the presenter, could better give their undivided attention to the presentation (29%), valued the eye contact or face-to-face interaction with the presenter (14%), or found presentation software distracting (14%).

Presenter outcomes and correlates of success.

A series of one-way ANOVAs revealed that presentation format did not affect the presenters’ judgments about the business scenario (e.g., “What do you think [Company X] should do?”), self-reported comprehension of the business scenario (“How much do you think you understand the situation with [Company X] and i-Mart?”), or ratings of their own motivation (e.g., “This activity was fun to do”), self-efficacy (e.g., “I think I am pretty good at this activity”), effort (e.g., “I tried very hard on this activity”), and effectiveness as presenters (“How convincing do you think your presentation will be to [Company X]’s board members?”); participants using different presentation formats also did not differ in their performance on the multiple-choice test about the business scenario, all ps > .05.

The presenter groups did differ in how inclined they were to recommend their presentation format to others (“How likely are you to recommend the presentation tool or presentation format you used to others to make professional presentations?”), F(2,144) = 4.2, p = .02, with presenters who used Prezi or PowerPoint being more likely to recommend their format than those who made oral presentations, LSD p = .03 and p = .007, respectively.

Presenter variables—including demographic characteristics and experience with their assigned format—generally did not predict their presentation success, either in terms of audience ratings or rankings. The one exception was that Prezi presenters who were better able to identify effective Prezi presentations were rated and ranked as giving more effective and engaging presentations, .008 < ps < .04.

Participants who were randomly assigned to present using Prezi were judged as giving more effective, organized, engaging, and persuasive presentations than those who were randomly assigned to present orally or with PowerPoint. This was true despite the fact that both audience and presenter participants were initially predisposed against Prezi. What might explain these findings?

One explanation is a novelty effect: perhaps the audience preferred Prezi simply because it was relatively new to them. This appears not to have been the case, however: only 8% of participants claimed that they preferred Prezi because it was new or different, and there was no significant relationship between the audiences’ experience with Prezi and their ratings or rank-orderings.

Another explanation for these results is that the presenters or audience members were somehow biased towards the Prezi presentations. Again, however, this appears not to be the case. The presenters were least experienced in Prezi, judged themselves least effective presenting with Prezi, and found Prezi presentations hardest to create. We recruited only a small minority (8%) of presenters based on their prior association with Prezi, and used the most conservative exclusion criteria feasible: only individuals without any experience with Prezi or PowerPoint were excluded from participating. All presenters were randomly assigned to their presentation format and were blind to the experimental manipulation. In recruiting audience participants, we did not mention Prezi or PowerPoint, and selected participants only based on their access to Skype and a sufficiently large computer screen. In addition, we minimized contact between the investigator and research participants, and presentations were never identified based on their format; at the end of the experiment, in fact, some participants did not even realize that they had seen a Prezi presentation (as evidenced by their free responses). Data were collected through standardized, online surveys, the investigator was not in the room with the presenter during his or her presentation, and the investigator interacted with the audience only briefly to set up their Skype session. Finally, an analysis of ratings from only the first presentations yielded the same results as the full analysis, making implausible an interpretation based on audience demand characteristics.

Thus, the most likely explanation is that individuals do, in fact, perceive Prezi presentations more favorably than PowerPoint or oral presentations. Experiment 1 has several limitations, however. First, because each audience participant in Experiment 1 was exposed to multiple presentations, we were unable to evaluate presentations on their ultimate goal: to convince the audience (role-playing Company X board members) to accept i-Mart’s business offer. In other words, Experiment 1 demonstrated that Prezi presentations are more effective than other formats in terms of audience perceptions, but not in terms of decision-making outcomes. Second, we asked the audience about their pre-existing beliefs and prior experiences with PowerPoint, Prezi, and oral presentations at the beginning of Experiment 1; although it is difficult to imagine how this questioning could have produced the obtained results—particularly given the nature of their pre-existing beliefs and prior experiences—it remains a remote possibility. Third, as with the results from any single experiment, the findings of Experiment 1 should be treated cautiously until replicated. We designed a second experiment to address these limitations and extend the findings from the first experiment.

Experiment 2

In Experiment 2 we showed online participants a single presentation from Experiment 1, and varied randomly which type of presentation (Prezi, PowerPoint, or oral) they viewed. We also randomly assigned some participants to view a presentation on material that was not related to the case material; this control condition served as a baseline that allowed us to estimate the impact of each presentation format. To minimize demand characteristics, we asked participants about their experiences with different presentation formats at the conclusion of the experiment (instead of the beginning), and did not expose participants to multiple presentation formats. Finally, to investigate better the nature of participants’ perceptions about presentation effectiveness, we distinguished between perceptions about the presentation, the presenter, and the audiovisual component of the presentation.

We recruited native English-speaking participants via Amazon’s Mechanical Turk using the following language: “In this study, you will read a business case, watch presentations, assume a role, and make a decision.” They were compensated $4 for approximately one hour of their time. Excluding pilot participants who offered us initial feedback on the survey and protocol, 1398 individuals consented to and began the experiment. Of these, 16 participants were excluded because of evidence that they did not complete the task properly (e.g., answering a long series of questions identically, incorrectly answering a “trap” question), and 305 were excluded because they dropped out before completing all of the outcome measures, leaving 1069 participants in the final dataset: 272 in the Prezi group, 261 in the PowerPoint group, 275 in the oral presentation group, and 261 in the control group. The number of excluded participants did not covary with group assignment or demographic variables. Table 6 presents demographic information on the included participants.

[Table 6] https://doi.org/10.1371/journal.pone.0178774.t006

The main stimuli for this experiment consisted of recorded presentations from Experiment 1. For Prezi and PowerPoint presentations, these were split-screen videos showing the presenter on one side of the screen and the visuals on the other side. For the oral presentations, these were simply audiovisual recordings of the presenter.

Of the 146 presenter participants from Experiment 1, 33 either did not consent to being video-recorded or were not recorded due to technical difficulties. We therefore had a pool of 113 presentation videos to use for Experiment 2: 41 from the Prezi condition (out of a possible 50), 40 from the PowerPoint condition (out of a possible 49), and 32 from the oral presentation condition (out of a possible 47). The proportion of presentations that were video-recorded did not vary with their format, exact p = .61.

Some of the recorded presentations from Experiment 1 were unusable because of intractable quality issues (e.g., inaudible speech, incomplete video, partially occluded presenter), leaving a total of 89 usable videos (34 Prezi, 28 PowerPoint, 27 oral). The proportion of videos removed because of quality issues did not vary with presentation format, exact p = .57.

We randomly selected 25 videos in each format, resulting in a total pool of 75 videos. Because of a URL typo that was not detected until after testing, one PowerPoint video was not presented, and participants assigned to that video were not able to complete the experiment. Video length varied by format, F(2, 71) = 4.2, p = .02, with PowerPoint and Prezi presentations lasting longer than oral presentations (M = 5.9, 6.0, and 4.6 minutes, respectively).

We were concerned that we could have, perhaps unconsciously, selected better stimuli in the Prezi condition, which would have biased the results. To ensure that our judgments of major audiovisual problems and subsequent exclusion of some videos were not biased, we recruited a separate group of participants to rate the audiovisual quality of the 113 presentation videos. Using the following language, we recruited 455 individuals from Amazon’s Mechanical Turk to serve as judges:

In this study you will judge the technical quality of three short videos. To participate you must have a high-speed Internet connection. We will compensate you $2 for 15–20 minutes of your time.

These participants were totally blind to the experimental hypotheses and manipulation. They completed the audiovisual rating task completely online via the Qualtrics survey platform, and were given the following instructions:

We need your help in determining the audiovisual quality of some Skype presentations we recorded. We want to know which presentations we can use for additional research, and which need to be eliminated due to major technical problems with the recordings. The sorts of technical problems that might exist in some of the videos are: incomplete recordings (the recording starts late or stops early), cropped recordings (the camera isn’t positioned properly), choppy or blurry video, and absent or inaudible audio.
You will watch a single presentation video. Please ignore any aspect of the recording other than its audiovisual quality. In particular, do not base your judgments on the presentation itself, including the presenter’s argument, appearance, or the nature of the accompanying slides. The only thing we care about is whether the audio and video were recorded properly.
Finally, please keep in mind that because these videos were recorded through Skype, even the best recordings are not very high quality.

These judge participants then watched a presentation video (selected at random), rated the quality of its audio and video (on a five-level scale from “very bad” to “very good”), and indicated whether or not there were “any major technical problems with the presentation’s audio or video”; those who reported major technical problems were asked to identify them.

To address any possibility of experimenter bias—which seemed unlikely, given that we designed the procedure from the outset to guard against such effects—we conducted a series of Presentation Format (Prezi, PowerPoint, oral) x Quality Judgment (inclusion, exclusion) ANOVAs to test 1) whether audiovisual quality was for any reason confounded with presentation format (i.e., the main effect of Presentation Format), 2) whether the excluded videos were indeed of lower quality than the included videos (i.e., the main effect of Quality Judgment), and 3) whether our exclusion of videos was biased by their format (i.e., the interaction between Presentation Format and Quality Judgment). We conducted the ANOVAs on the three measures of audiovisual quality collected from the independent judges: ratings of audio quality, ratings of video quality, and judgments of major audiovisual problems.

The results were straightforward: for all three dependent variables, there were no main effects of Presentation Format, ps > .13, but we did find a significant main effect of Quality Judgment (with included videos being judged better quality than excluded videos), all ps < .002, and did not find any interaction effects, all ps > .31. In other words, presentation format was not confounded with audiovisual quality, our judgments of quality corresponded to those of blind judges, and our exclusion of videos was unrelated to presentation format.
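
A sketch of one such Presentation Format x Quality Judgment ANOVA, assuming a long-format table of the independent judges' ratings with illustrative column names ('quality', 'fmt', 'included'); this is a reconstruction of the general approach, not the authors' code:

    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    def quality_anova(judgments):
        # judgments: one row per judge rating, with columns
        #   quality  - 1-5 audio or video quality rating
        #   fmt      - "prezi", "ppt", or "oral"
        #   included - True if the video was retained for Experiment 2
        model = smf.ols("quality ~ C(fmt) * C(included)", data=judgments).fit()
        table = sm.stats.anova_lm(model, typ=2)  # main effects plus the interaction
        print(table)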

Participants completed the experiment entirely online through Qualtrics. After providing informed consent and answering preliminary demographic and background questions (e.g., about their familiarity with business concepts and practices), they were told the following:

In this part of the study, you are going to play the role of a corporate executive for [Company X], an innovative clothing company. Another company, i-Mart, wants to sell [Company X’s] t-shirts in its many retail stores. You must decide whether or not to accept i-Mart's offer.
To help you make your decision, we will first provide you with some background on [Company X] and the i-Mart offer. You will see a series of short videos and text that describe relevant aspects of [Company X’s] origins, business model, practices, culture, and community. Please review this background material carefully.

Participants were then shown a series of brief video and textual descriptions of the fictionalized corporate scenario, including information on Company X’s business model, business processes, community, and culture. This material was an abridged version of what Experiment 1 presenter participants studied, but an expanded version of what Experiment 1 audience participants studied.

After viewing the multimedia case material, the participants were asked to identify what product Company X sells (a “trap” question to exclude non-serious participants) and to rate the background material on how engaging it was, how much they enjoyed it, how much they paid attention to it, and how difficult it was to understand.

Participants randomly assigned to the Prezi, PowerPoint, and Oral Presentation conditions were then told the following:

Now that you know a little bit about the company, you will watch a video presentation from another research participant. Just as you are playing the role of a [Company X] executive, the other participant is playing the role of i-Mart's Chief Marketing Officer (CMO). In this presentation, he or she will try to convince you and your fellow [Company X] executives to accept i-Mart's offer.
Because this presentation is from another research participant playing the role of an i-Mart executive--and not an actual i-Mart executive--please disregard the presenter's appearance (clothing, age, etc). And because we did not professionally videorecord the presentation, please also try to disregard the relatively poor quality of the video compared to the videos you just viewed.
The purpose of this research is to understand what makes presentations effective. So please listen carefully and do your best to imagine that this is "real".

Identically to Experiment 1, participants rated the presentation on how organized, engaging, realistic, persuasive, and effective it was on a five-level scale from “not at all” to “extremely.” Using the same scale, these participants also rated the presenter on how organized, engaging, persuasive, effective, confident, enthusiastic, knowledgeable, professional, nervous, and boring he or she was.

Participants in the Prezi and PowerPoint groups were asked three additional questions. First, they were asked to rate the visual component of the presentation (i.e., the Prezi or the PowerPoint slides) on how organized, engaging, persuasive, effective, dynamic, visually compelling, distracting, informative, distinctive, and boring it was. Second, they were asked to rate whether the presentation had “not enough,” “too much,” or an “about right” amount of text, graphs, images, and animations. And finally, they were asked to comment on the visual component of the presentation, including ways in which it could be improved.

All participants then summarized the presentation in their own words, with a minimum acceptable length of 50 characters. Participants were asked to rate how well they understood the “situation with [Company X] and i-Mart,” and to decide whether [Company X] should accept or reject i-Mart’s offer (on a 6-level scale, with the modifiers “definitely,” “probably,” and “possibly”).

In addition, we asked participants a series of recall and comprehension questions about the case. An example recall question is “According to the background materials and the presentation, approximately how many members does [Company X] have?”, with four possible answers ranging from 500,000 to 1.5 million. An example comprehension question is “According to the background materials, what is the biggest challenge [Company X] is facing?”, with possible answers ranging from “marketing” to “logistics.” These comprehension questions were based on the instructor’s guide to the business case material, and included open-ended questions (“Why do you think [Company X] should accept or reject i-Mart's offer?”). At this point we also asked another trap question (“What is 84 plus 27?”).

Finally, and after answering all questions about the business case and presentation, participants answered background questions about their experience with, knowledge of, and general preference for different presentation formats. They also rank-ordered the mini examples of Prezi, PowerPoint, and oral presentations in terms of their effectiveness. These background questions and tasks were the same as those used in Experiment 1.

Participants in the control condition completed the same protocol, with a few exceptions: First, instead of being shown presentations from Experiment 1, they viewed one of three instructional videos (matched for length with the Experiment 1 presentations). Before they viewed these videos they were told “Before you decide what to do about i-Mart's offer to [Company X], we would like you to watch an unrelated presentation and briefly answer some questions about it.” Second, they did not rate how realistic the presentation was, nor did they rate the visual component on how organized, engaging, persuasive, effective, dynamic, visually compelling, distracting, informative, distinctive, and boring it was. And finally, they did not complete the final set of background questions on the different presentation formats or rank-order the example presentations.

At the outset, participants rated oral and PowerPoint presentations as equally effective in general, and Prezi presentations as less effective than the other two formats. Just as we found in Experiment 1, participants rated themselves as more experienced and effective in making oral and PowerPoint presentations than in making Prezi presentations. They also rated viewing oral and PowerPoint presentations as more enjoyable and effective for them than viewing Prezi presentations. When asked how difficult it was to make the different types of presentations, they rated Prezi as more difficult than oral and PowerPoint presentations, and oral presentations as more difficult than PowerPoint ones. In terms of the number of presentations watched in the last year and in their lifetime—as well as the number of years of experience—they reported more experience watching oral than PowerPoint presentations, and more experience watching PowerPoint than Prezi presentations. The same pattern held for their reported experience in making presentations, with one exception: they reported making more PowerPoint than oral presentations in their lifetime. Table 7 presents full descriptive and inferential statistics for all self-reported measures of prior experience with and preexisting beliefs about Prezi, PowerPoint, and oral presentations. The experimental groups did not differ significantly on any of these variables.

[Table 7] https://doi.org/10.1371/journal.pone.0178774.t007

Most participants (78%) were either “not at all familiar” or “slightly familiar” with Company X, and the modal participant reported being “somewhat experienced” with “concepts and practices from the business world, such as strategy, innovation, product development, sales, and marketing.” The groups did not differ significantly on these variables, nor did they differ on demographic variables such as age, gender, or education.

For overall judgments of the presentations, participants rated Prezi presentations as more organized, effective, engaging, and persuasive than PowerPoint and oral presentations, and rated PowerPoint presentations no differently than oral presentations. They also rated Prezi presenters as more organized, knowledgeable, effective, and professional than PowerPoint and oral presenters; Prezi presenters were not rated differently from other presenters on how nervous, boring, enthusiastic, confident, persuasive, or engaging they were, and PowerPoint presenters were rated no differently than oral presenters on all dimensions. In judging the visual components of the Prezi and PowerPoint presentations, the audience rated the Prezis as more dynamic, visually compelling, and distinctive than the PowerPoint slides, and as marginally more effective and persuasive.

The magnitudes of the mean differences show that some effects are clearly larger than others. Most notably, Prezi presentations were rated as the most organized and visually dynamic, and Prezi presenters were rated as the most organized. Fig 4 and Table 8 present the descriptive and inferential statistics, respectively, for these audience ratings.

[Table 8] https://doi.org/10.1371/journal.pone.0178774.t008

[Fig 4] Note: rating dimensions are ordered by the magnitude of the difference between Prezi and the other presentation formats; for dimensions with no significant differences between presentation formats, only the overall mean is displayed. https://doi.org/10.1371/journal.pone.0178774.g004

The modal participant rated the background case material on Company X as “very engaging” and “completely enjoyable,” reported “mostly” understanding the situation with i-Mart and Company X, and rated the presentations as “very realistic.” Seventy percent of participants expected to do “somewhat well” or “very well” when quizzed about the case. There were no significant group differences on any of these variables.

Audience decision-making.

Did the presentations actually influence participants’ core judgment of the business scenario and, if so, was one presentation format more effective than others?

Participants who received a Prezi presentation accepted i-Mart’s offer 53.7% of the time, participants who received a PowerPoint presentation accepted the offer 49.8% of the time, participants exposed to an oral presentation accepted it 45.5% of the time, and participants exposed to the control presentation accepted it 37.5% of the time (see Fig 5). In an omnibus test, these differences were significant, exact p = .002. Specific comparisons revealed that Prezi presentations were significantly more influential than control presentations, exact p = .0003, marginally more influential than oral presentations, exact p = .06, and no more influential than PowerPoint presentations, exact p = .39; PowerPoint presentations were significantly more influential than control presentations, exact p = .006, but not more influential than oral presentations, exact p = .34; oral presentations were marginally more influential than control presentations, exact p = .07. To investigate the impact of presentation software on decision-making, we contrasted the combined Prezi and PowerPoint groups with the oral presentation group, and found a marginally significant effect, exact p = .06.
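
The acceptance-rate comparisons can be approximated from the percentages and group sizes reported above. The sketch below uses a chi-square omnibus test and Fisher's exact test for one pairwise contrast; the counts are rounded reconstructions rather than the raw data, and the paper itself reports exact p-values throughout:

    import numpy as np
    from scipy import stats

    # Approximate accept counts reconstructed from the reported percentages and group sizes.
    n      = {"prezi": 272, "ppt": 261, "oral": 275, "control": 261}
    accept = {"prezi": round(0.537 * 272), "ppt": round(0.498 * 261),
              "oral": round(0.455 * 275), "control": round(0.375 * 261)}

    # Omnibus test over the 4 x 2 accept/reject table (chi-square approximation).
    table = np.array([[accept[g], n[g] - accept[g]] for g in n])
    chi2, p, dof, _ = stats.chi2_contingency(table)
    print(f"omnibus: chi2({dof}) = {chi2:.2f}, p = {p:.4f}")

    # Example pairwise comparison (Prezi versus control) with Fisher's exact test.
    sub = np.array([[accept["prezi"], n["prezi"] - accept["prezi"]],
                    [accept["control"], n["control"] - accept["control"]]])
    odds, p_pair = stats.fisher_exact(sub)
    print(f"Prezi vs control: OR = {odds:.2f}, exact p = {p_pair:.4f}")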

[Fig 5] https://doi.org/10.1371/journal.pone.0178774.g005

On the whole, therefore, the participants’ decision-making results were concordant descriptively (if not always inferentially) with the rating results.

If participants’ perceptions of the presentations and decisions about the case were both influenced by presentation format, then we would expect them to be associated with each other. And this is indeed what we found. Excluding participants in the control group (who did not make judgments about comparable presentations), those who rejected the i-Mart offer rated presentations as worse than those who accepted the i-Mart offer. This was true for 23 of the 24 rating dimensions (“visually boring” was the exception), with the largest effects for ratings of effectiveness and persuasiveness. Those who rejected the offer rated the overall presentation, visual aids, and presenter as less effective than those who accepted the offer, with effect sizes (Cohen’s d) of .93, .83, and .78, respectively. These effects were consistent across formats, all interaction ps > .05.
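
For reference, a minimal helper for the pooled-standard-deviation form of Cohen's d used to express these accept-versus-reject differences (the function and variable names are illustrative):

    import numpy as np

    def cohens_d(x, y):
        # Cohen's d with a pooled standard deviation.
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        nx, ny = len(x), len(y)
        pooled_sd = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1))
                            / (nx + ny - 2))
        return (x.mean() - y.mean()) / pooled_sd

    # Usage: cohens_d(ratings_of_accepters, ratings_of_rejecters) for each rating dimension.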

We conducted an analogous set of analyses that preserved the original 6-level scale of the decision variable (“possibly accept,” “probably accept,” “definitely accept,” “possibly reject,” “probably reject,” “definitely reject”). These analyses produced qualitatively identical results, both in terms of decision-making as a function of group assignment and the correlation between decision-making and presentation ratings.

Memory and comprehension.

Participants’ performance on the four rote memory questions did not vary across conditions, nor did their correct identification (according to the case designers) of reasons to accept or reject the offer, with one exception: compared to those in the treatment groups, control participants were more likely to identify Company X’s ability to meet production demand as a reason to reject the i-Mart offer, omnibus exact p = .00004.

Correlates of presentation outcomes.

There were no notable correlations between demographic variables and participants’ ratings or decisions. In particular, participants’ experience with or preexisting beliefs about each presentation format did not correlate with their ratings of the experimental presentations, mirroring the results from Experiment 1 (but with much greater statistical power). Presentation length or recording quality (as assessed by the independent judges) did not correlate with presentation outcomes.

Participants’ success in distinguishing better from worse presentations of each format—that is, their rank-ordering of short expert-created examples—correlated slightly with their evaluations of the presentations. Most notably, the better participants did on the rank-ordering PowerPoint task, the worse they rated PowerPoint (but not Prezi) presentations on visual dimensions; the same was true for the Prezi task and presentations. For example, participants’ performance in the PowerPoint task correlated negatively with their judgments of how “visually dynamic” PowerPoint presentations were, r = -.22, p = .0005, and participants’ performance on the Prezi task correlated negatively with their judgments of how “visually dynamic” Prezi presentations were, r = -.16, p = .009. Thus, individuals with more expertise in PowerPoint and Prezi were more critical of PowerPoint and Prezi presentations, respectively.
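As an illustration of this kind of correlation analysis, the sketch below computes a Pearson correlation between hypothetical rank-ordering scores and hypothetical "visually dynamic" ratings; it is a stand-in, not the authors' code or data.

```python
# Pearson correlation between rank-ordering task performance and a visual rating
# (both vectors are hypothetical stand-ins for the study's variables).
from scipy.stats import pearsonr

task_score = [0.2, 0.5, 0.8, 0.4, 0.9, 0.1, 0.6, 0.7]      # PowerPoint rank-ordering accuracy
dynamic_rating = [4.5, 4.0, 3.1, 4.2, 2.8, 4.8, 3.6, 3.3]  # "visually dynamic" rating of PowerPoint

r, p = pearsonr(task_score, dynamic_rating)
print(f"r = {r:.2f}, p = {p:.4f}")
```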

Audiovisual attributes of Prezi and PowerPoint presentations.

To understand the media attributes and psychological mechanisms that underlie the observed effects of format, we examined how participants’ judgments about the amount of text, graphs, animations, and images in the presentations correlated with their judgments of the presentations, the visual component of the presentations, and the presenters themselves. To examine these relationships, we conducted one-way ANOVAs with the various ratings as the dependent variables, and participants’ judgments (“not enough,” “about right,” “too much”) about the amount of text, graphs, animations, and images in the PowerPoint and Prezi presentations as the independent variable. For nearly all (80 of 96) of these ANOVAs, the results were highly significant, ps < .001. In judging the amount of text, participants typically rated “too much” or “not enough” text as worse than an “about right” amount; in judging graphs, images, and animations, participants typically rated “too much” and “about right” as equally better than “not enough.” Averaging across all rating dimensions, the text and graph effects were over twice as large as the animation and image effects; averaging across all attributes, the effects for visual ratings were over twice as large as the effects for presenter and overall ratings. Participants’ judgments about the media attributes of presentations did, therefore, relate to their overall assessments of the presenters and presentations.
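A minimal sketch of one of these ANOVAs, using hypothetical ratings grouped by the three-level "amount of text" judgment (SciPy's one-way ANOVA stands in for whatever statistical software the authors used):

```python
# One-way ANOVA: a single rating dimension as the dependent variable and the
# three-level "amount of text" judgment as the factor (hypothetical data).
from scipy.stats import f_oneway

ratings_by_text_judgment = {
    "not enough":  [3.1, 2.8, 3.4, 2.9, 3.2],
    "about right": [4.2, 4.5, 4.0, 4.4, 4.6],
    "too much":    [3.0, 3.3, 2.7, 3.1, 2.9],
}

f_stat, p_value = f_oneway(*ratings_by_text_judgment.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```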

Summing across PowerPoint and Prezi presentations, the modal participant indicated that there was the “about right” amount of text, graphs, animations, and images. Only 21% of participants thought there was not enough or too much text; for the other dimensions, this percentage ranged from 42–51%. More participants indicated that there was not enough text, graphs, and animations in PowerPoint presentations than Prezi presentations, with animation as the most distinguishing attribute. Table 9 presents the descriptive and inferential statistics for these variables.

Table 9. https://doi.org/10.1371/journal.pone.0178774.t009

As shown in Table 10, participants’ judgments about the audiovisual attributes of the Prezi and PowerPoint presentations were associated with the decision about the business scenario. Individuals who reported that there was not enough text, graphs, animations, or images tended to reject the offer for i-Mart, whereas those who reported an “about right” amount of those attributes tended to accept the offer. This effect was particularly pronounced for judgments of graphs and text. Participants who reported too much text also tended to reject the offer.

Table 10. https://doi.org/10.1371/journal.pone.0178774.t010

In sum, participants’ perceptions of presenters and the presentations correlated with their evaluations of the amount of text, graphs, images, and animations that were included in the presentations. Presenters and presentations were rated worse if they had too much or not enough text, and not enough graphs, images, and animations; in terms of audience decision-making, presentations were less effective if they contained too much or not enough text, or not enough graphs, animations, and images. PowerPoint presentations were judged to have too little of all attributes, particularly animation.

Replicating results from Experiment 1, participants rated presentations made with Prezi as more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations. This remained true despite participants’ preexisting bias against Prezi and the different context of Experiment 2: the audience did not view multiple presentations of different formats and presentations were prerecorded instead of live. Extending the Experiment 1 results, participants also judged Prezi presentations as better in various ways (e.g., more visually compelling, more dynamic) than PowerPoint presentations; participants even rated Prezi presenters more highly (e.g., more knowledgeable, more professional) than PowerPoint presenters.

In making decisions as corporate executives, participants were persuaded by the presentations. Compared to the control group’s baseline acceptance rate of 37.5%, acceptance shifted by 16.2, 12.3, and 8.0 percentage points among those who viewed Prezi, PowerPoint, or oral presentations, respectively. The non- or marginal significance of some between-format comparisons (e.g., PowerPoint versus Prezi) is difficult to interpret. We hesitate to dismiss these differences as statistical noise given their general alignment with rating results, as well as the correlation between business decisions and presentation ratings (which do vary significantly with format). For the more objective outcome of decision-making, we can, at the very least, provisionally conclude that Prezi presentations are more effective than oral presentations, and that software-aided presentations are more effective than oral presentations.

We did not find any evidence that the presentations affected participants’ memory or understanding of the case, nor did we find evidence that certain presentation formats impacted learning more than others. Given the goals of the presentations and design of the experiment, however, we hesitate to draw any conclusions from these null results.

General discussion

The most important finding across the two experiments is easy to summarize: Participants evaluated Prezi presentations as more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations. This finding was true for both live and prerecorded presentations, when participants rated or ranked presentations, and when participants judged multiple presentations of different formats or only one presentation in isolation. Results from Experiment 2 demonstrate that these presentations influenced participants’ core judgments about a business decision, and suggest that Prezi may benefit both behavioral and experiential outcomes. We have no evidence, however, that Prezi (or PowerPoint or oral presentations) facilitate learning in either presenters or their audience.

Several uninteresting explanations exist for the observed Prezi effects, none of which posit any specific efficacy of Prezi or ZUIs in general: namely, novelty, bias, and experimenter effects. We consider each in turn.

Novelty heavily influences both attention and memory [ 87 , 88 ], and the benefits of new media have sometimes dissipated over time—just as one would expect with novelty effects [ 3 ]. However, we found no evidence that novelty explains the observed benefits of Prezi: Participants who were less familiar with Prezi did not evaluate Prezi presentations more favorably, and only a small fraction of participants who favored Prezi explained their preference in terms of novelty. We therefore are skeptical that mere novelty can explain the observed effects.

We also considered the possibility that participants had a pre-existing bias for Prezi. This seems unlikely because presenter participants were selected based only on minimal experience with both PowerPoint and Prezi and were assigned randomly to the experimental groups; audience participants from both experiments were selected based merely on high-speed internet access, and the words “Prezi” and “PowerPoint” were not used in any audience recruitment material. In fact, both sets of participants entered the research with biases against Prezi, not for Prezi: They reported more experience with PowerPoint and oral presentations than Prezi, and perceived PowerPoint and oral presentations as more (not less) efficacious than Prezi. Thus, we reject the idea that the results simply reflect pre-existing media biases.

For many reasons, we also find it unlikely that experimenter effects—including demand characteristics (i.e., when participants conform to the experimenters’ expectations)—can explain the observed effects. First, at the outset we did not have strong hypotheses about the benefits of one format over the others. Second, the results are subtle in ways that neither we nor a demand characteristics hypothesis would predict: the effects on subjective experience diverged somewhat from the effects on decision-making, and there were no memory or comprehension effects. Third, the between-participants design of Experiment 2 (and between-participants analysis of Experiment 1 ) limited participants’ exposure to a single presentation format, thereby minimizing their ability to discern the experimental manipulation or research hypotheses. Fourth, we ensured that the presentations were equally high-quality; we did not unconsciously select Prezi presentations that happened to be higher quality than presentations in the other formats. Fifth, the random assignment of presenters to format limits the possible confounding of presenter variables with presentation formats or qualities; and no confounding with format was observed in presenters’ preexisting beliefs, prior experience, or demographics. And finally, in Experiment 2 we only explicitly mentioned or asked participants questions about Prezi, PowerPoint, and oral presentations at the conclusion of the experiment, after collecting all key outcome data.

We therefore conclude that the observed effects are not confounds or biases, but instead reflect a true and specific benefit of Prezi over PowerPoint or, more generally, ZUIs over slideware. If, however, these experimental effects merely reveal that Prezi is more user-friendly than PowerPoint—or that PowerPoint’s default templates encourage shallow processing by “[fetishizing] the outline at the expense of the content” [89] (p. B26)—then we have learned little about the practice or psychology of communication. But if these effects instead reflect intrinsic properties of ZUIs or slideware, then they reveal more interesting and general insights about effective communication.

It is difficult to understand Prezi’s benefits in terms of user-friendliness because the odds were so clearly stacked in PowerPoint’s favor. Presenters were much more experienced in using PowerPoint than Prezi and rated PowerPoint as easier to use than Prezi. Especially given the task constraints—participants only had 45 minutes to prepare for a 5-minute presentation on a relatively new, unfamiliar topic—Prezi’s user interface would have to be improbably superior to PowerPoint’s interface to overcome these handicaps. Moreover, participants’ prior experience with PowerPoint or Prezi did not correlate with their success as presenters, contrary to what an ease-of-use explanation would predict. Finally, audience participants did not simply favor the Prezi presentations in an even, omnibus sense—they evaluated Prezi as better in particular ways that align with the purported advantages of ZUIs over slideware. This pattern of findings makes most sense if the mechanism were at the level of media, not software.

Participants’ evaluations of Prezi were particularly telling in three ways. First, in participants’ own words (from Experiment 1), they frequently described Prezi as engaging, interactive, visually compelling, visually pleasing, or vivid, and PowerPoint as concise, clear, easy to follow, familiar, professional, or organized. Second, in participants’ ratings (from Experiment 2), the visuals from Prezi presentations were evaluated as significantly more dynamic, visually compelling, and distinctive than those from PowerPoint presentations. And third, in judging the audiovisual attributes of presentations, participants identified animations as both the attribute most lacking in presentations and the attribute that most distinguished Prezi from PowerPoint; furthermore, the more a presentation was judged as lacking animation, the worse it was rated. Taken together, this evidence suggests that Prezi presentations were not just better overall, but were better at engaging visually with their audience through the use of animation. Because ZUIs are defined by their panning and zooming animations—and animation is an ancillary (and frequently misused) feature of slideware—the most parsimonious explanation for the present results is in terms of ZUIs and slideware in general, not Prezi and PowerPoint in particular. The medium is not the message, but it may be the mechanism.

The animated nature of ZUIs makes more sense as a possible mechanism for the observed effects when one considers the relevant literature on animation. Past research has shown that animation can induce physiological and subjective arousal (e.g., [90, 91]) and facilitate attention, learning, and task performance (e.g., [92–94]; but see also [95, 96]). Most pertinently, people appear to prefer animated media over static media. Participants rate animated online advertisements as more enjoyable, persuasive, effective, and exciting than static online advertisements [97, 98], animated websites as more likeable, engaging, and favorable than static websites [99], and animated architectural displays as clearer than static displays [100]. In an experiment on online academic lectures, participants preferred whiteboard-style animations over a slideware-style version matched for both visual and audio content [101]. Moreover, ZUIs’ use of animation aligns with recommended principles for using animation effectively in presentations, which include the creation of a large virtual canvas and the use of zooming to view detail [102]. Slideware, on the other hand, encourages the use of superfluous animation in slide transitions and object entrances/exits, despite evidence that adding such “seductive details” to multimedia presentations can be counterproductive [72].

Therefore, we not only conclude that audiences prefer Prezi over PowerPoint presentations, but also conclude that their preference is rooted in an intrinsic attribute of ZUIs: panning and zooming animations. Compared to slideware’s sequential, linear transitions (and oral presentations’ total lack of visual aids), zooming and panning over a virtual canvas is a more engaging and enjoyable experience for an audience.

From this perspective, the reason that participants rated Prezi presentations as more persuasive, effective, and organized than other presentations—and Prezi presenters as more knowledgeable, professional, effective, and organized than other presenters—is that they confuse media with messages and messengers. Dual-process models of persuasion contend that opinion change occurs not just through slow deliberations grounded in logic and reason but also through fast shortcuts rooted in associations and cues [103–106]. If better presenters with better arguments tend to give better presentations, then an audience’s experience while viewing a presentation may shade their judgments about its presenter or argument. This is the same basic logic of research that demonstrates PowerPoint’s persuasion advantage over oral presentations [53, 54]. Just as audiences appear more persuaded by slideware than by oral presentations, they also appear more persuaded by ZUI than by slideware presentations. But unlike past research, we do not argue that audience members use technological sophistication as a cue for argument quality [53] or presenter preparedness [54]; instead, we suggest that they use their subjective viewing experience as a heuristic for judging both presentations and presenters. Because ZUI presentations are more engaging than slideshows, ZUI presentations and presenters are judged more positively than slideshows.

Concluding remarks

Media research, including research into presentation software, is plagued methodologically by a lack of experimental control, the unjustifiable assumption that media effects are constant across individuals and content, and a failure to account for the biases of all involved: the presenters, the audiences, and the researchers. In the research reported here we strived to overcome these challenges by randomly assigning presenters and audience members to competing presentation formats, blinding them to the experimental manipulations, and sampling a sufficient array of presentations within each format.

Our conclusions about the advantages of ZUIs (such as Prezi) over slideware (such as PowerPoint) and oral presentations are, of course, tentative. Further research will need to replicate the findings across different presentation contexts, clarify whether the subjective benefits of ZUIs over slideware result in decision-making or behavioral advantages, and better investigate the precise media attributes responsible for these advantages. Like others [ 107 ], we caution against technological determinism: Presentation medium is but one of many factors that determine presentation success, and presentations that rely on any given medium can succeed or fail. Because slideware can be used to zoom and pan over a virtual canvas just as ZUIs can be used to create slideshows, the benefits of ZUIs over slideware are ultimately based on affordances: How much do certain formats encourage or enable psychologically advantageous media attributes, such as zooming and panning animations?

In many ways, it is surprising that we found any effects of presentation medium. The presentations differed in many ways aside from their format, ways that surely influenced their effectiveness: Each presentation was made by a different person (sampled from a diverse pool of participants), presenters chose what content to include in their presentation, and presenters decided how to convey that content within their assigned format. Under real-world circumstances in which presentations of different formats are actually contrasted with each other, we expect this background “noise” to be greatly reduced and the impact of format to be correspondingly greater.

Supporting information

S1 File. Experiment 1 audience pre-survey.

https://doi.org/10.1371/journal.pone.0178774.s001

S2 File. Experiment 1 audience post-survey.

https://doi.org/10.1371/journal.pone.0178774.s002

S3 File. Experiment 1 presenter pre-survey.

https://doi.org/10.1371/journal.pone.0178774.s003

S4 File. Experiment 1 presenter post-survey.

https://doi.org/10.1371/journal.pone.0178774.s004

S5 File. Experiment 2 audience post-survey.

https://doi.org/10.1371/journal.pone.0178774.s005

Acknowledgments

We would like to thank Erin Driver-Linn, Brooke Pulitzer, and Sarah Shaughnessy of the Harvard Initiative for Learning and Teaching for their institutional guidance and support, Nina Cohodes, Gabe Mansur, and the staff of the Harvard Decision Sciences Laboratory for their assistance with participant testing, Michael Friedman for his feedback on pilot versions of the study protocol, and Tom Ryder for his support in adapting the multimedia case for research purposes.

Author Contributions

  • Conceptualization: SMK ST STM.
  • Data curation: ST STM.
  • Formal analysis: ST STM.
  • Funding acquisition: SMK ST STM.
  • Investigation: ST.
  • Methodology: SMK ST STM.
  • Project administration: ST STM.
  • Resources: ST STM.
  • Software: ST STM.
  • Supervision: SMK ST STM.
  • Validation: SMK ST STM.
  • Visualization: STM.
  • Writing – original draft: STM.
  • Writing – review & editing: SMK ST STM.
References

  • 1. McLuhan M. The medium is the message. In: Understanding media: The extensions of man. 1964. p. 1–18.
  • 2. Salomon G. Interaction of media, cognition, and learning. Lawrence Erlbaum Associates Inc.; 1994.
  • 5. Stokes DE. Pasteur’s quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press; 1997.
  • 7. Parker I. Absolute PowerPoint: Can a software package edit our thoughts? The New Yorker. 2001 [cited 2016 Sep 3]. Available from: http://www.newyorker.com/magazine/2001/05/28/absolute-powerpoint
  • 10. Tufte ER. The cognitive style of PowerPoint. Cheshire, CT: Graphics Press; 2013.
  • 42. Frey BA, Birnbaum DJ. Learners’ perceptions on the value of PowerPoint in lectures. 2002 [cited 2016 Sep 5]. Available from: http://eric.ed.gov/?id=ED467192
  • 53. Guadagno RE, Sundie JM, Hardison TA, Cialdini RB. The persuasive power of PowerPoint® presentations. In: Proceedings of the 6th International Conference on Persuasive Technology: Persuasive Technology and Design: Enhancing Sustainability and Health. ACM; 2011 [cited 2015 Dec 26]. p. 2. Available from: http://dl.acm.org/citation.cfm?id=2467805
  • 55. Kahneman D. Thinking, fast and slow. Macmillan; 2011. 511 p.
  • 56. Gunelius S. Stand out from competitors with Prezi presentations. Forbes. 2011 [cited 2016 Sep 5]. Available from: http://www.forbes.com/sites/work-in-progress/2011/03/23/stand-out-from-competitors-with-prezi-presentations/
  • 58. Rockinson-Szapkiw AJ, Knight A, Tucker JM. Prezi: Trading linear presentations for conceptual learning experiences in counselor education. 2011 [cited 2016 Sep 3]. Available from: https://works.bepress.com/amanda_rockinson_szapkiw/18/
  • 61. Adams S. How Prezi’s Peter Arvai plans to beat PowerPoint. Forbes. 2016 [cited 2016 Sep 5]. Available from: http://www.forbes.com/sites/forbestreptalks/2016/06/07/how-prezis-peter-arvai-plans-to-beat-powerpoint/
  • 62. McCloud S. Reinventing comics. New York, NY, USA: Paradox Press; 2000.
  • 67. Wilson ML, et al. A longitudinal study of exploratory and keyword search. In: Proceedings of the 8th ACM/IEEE-CS Joint Conference on Digital Libraries. ACM; 2008 [cited 2016 Aug 16]. p. 52–56. Available from: http://dl.acm.org/citation.cfm?id=1378899
  • 69. Luria AR. The mind of a mnemonist: A little book about a vast memory. Harvard University Press; 1968.
  • 72. Mayer RE. Multimedia learning. In: Ross BH, editor. Psychology of Learning and Motivation. Academic Press; 2002 [cited 2013 Dec 19]. p. 85–139. Available from: http://www.sciencedirect.com/science/article/pii/S0079742102800056
  • 75. Brock S, Brodahl C. A tale of two cultures: Cross-cultural comparison in learning the Prezi presentation software tool in the US and Norway. In: Proceedings of the Informing Science and Information Technology Education Conference. 2013 [cited 2015 Dec 26]. p. 95–119. Available from: http://www.editlib.org/p/114622/
  • 78. Ballentine BD. High concept and design documentation: Using Prezi for undergraduate game design. In: 2012 IEEE International Professional Communication Conference. 2012. p. 1–5.
  • 82. Bean JW. Presentation software supporting visual design: Displaying spatial relationships with a zooming user interface. In: 2012 IEEE International Professional Communication Conference (IPCC). IEEE; 2012. p. 1–6.
  • 85. Harvard Business Publishing—Cases [Internet]. [cited 2016 Sep 10]. Available from: https://cb.hbsp.harvard.edu/cbmp/pages/content/cases
  • 94. Shanmugasundaram M, Irani P. The effect of animated transitions in zooming interfaces. In: Proceedings of the Working Conference on Advanced Visual Interfaces (AVI ’08). New York, NY, USA: ACM; 2008 [cited 2016 Sep 5]. p. 396–399. Available from: http://doi.acm.org/10.1145/1385569.1385642
  • 101. Turkay S, Moulton ST. The educational impact of whiteboard animations: An experiment using popular social science lessons. In: Proceedings of the 7th International Conference of Learning International Networks Consortium (LINC). Cambridge, MA, USA; 2016. p. 283–91.
  • 103. Chaiken S, Liberman A, Eagly AH. Heuristic and systematic information processing within and beyond the persuasion context. In: Unintended thought. New York, NY, USA: The Guilford Press; 1989. p. 212–52.
  • 104. Gilovich T, Griffin DW, Kahneman D. Heuristics and biases: The psychology of intuitive judgment. Cambridge, UK; New York: Cambridge University Press; 2002.
  • 105. Petty RE, Cacioppo JT. Communication and persuasion: Central and peripheral routes to attitude change. New York: Springer-Verlag; 1986.
  • 106. Petty RE, Wegener DT. The elaboration likelihood model: Current status and controversies. In: Chaiken S, Trope Y, editors. Dual-process theories in social psychology. New York, NY, US: Guilford Press; 1999. p. 37–72.


CBE Life Sci Educ, vol. 17, no. 1 (Spring 2018)

Scientific Presenting: Using Evidence-Based Classroom Practices to Deliver Effective Conference Presentations

Lisa A. Corwin

† Department of Ecology & Evolutionary Biology, University of Colorado, Boulder, Boulder, CO 80309

Amy Prunuske

‡ Department of Microbiology and Immunology, Medical College of Wisconsin–Central Wisconsin, Wausau, WI 54401

Shannon B. Seidel

§ Biology Department, Pacific Lutheran University, Tacoma, WA 98447

Scientific presenting is the use of scientific teaching principles—active learning, equity, and assessment—in conference presentations to improve learning, engagement, and inclusiveness. This essay presents challenges presenters face and suggestions for how presenters can incorporate active learning strategies into their scientific presentations.

Scientists and educators travel great distances, spend significant time, and dedicate substantial financial resources to present at conferences. This highlights the value placed on conference interactions. Despite the importance of conferences, very little has been studied about what is learned from the presentations and how presenters can effectively achieve their goals. This essay identifies several challenges presenters face when giving conference presentations and discusses how presenters can use the tenets of scientific teaching to meet these challenges. We ask presenters the following questions: How do you engage the audience and promote learning during a presentation? How do you create an environment that is inclusive for all in attendance? How do you gather feedback from the professional community that will help to further advance your research? These questions target three broad goals that stem from the scientific teaching framework and that we propose are of great importance at conferences: learning, equity, and improvement. Using a backward design approach, we discuss how the lens of scientific teaching and the use of specific active-learning strategies can enhance presentations, improve their utility, and ensure that a presentation is broadly accessible to all audience members.

Attending a conference provides opportunities to share new discoveries, cutting-edge techniques, and inspiring research within a field of study. Yet after presenting at some conferences, you might leave feeling as though you did not connect with the audience, did not receive useful feedback, or are unsure of where you fit within the professional community. Deciding what to cover in a presentation may be daunting, and you may worry that the audience did not engage in your talk. Likewise, for audience members, the content of back-to-back talks may blur together, and they may get lost in acronyms or other unfamiliar jargon. Audience members who are introverted or new to the field may feel intimidated about asking a question in front of a large group containing well-known, outspoken experts. After attending a conference, one may leave feeling curious and excited or exhausted and overwhelmed, wondering what was gained from presenting or attending.

Conferences vary widely in purpose and location, ranging from small conferences hosted within home institutions to large international conferences featuring experts from around the world. The time and money spent to host, attend, and present at conferences speaks to the value placed on engaging in these professional interactions. Despite the importance of conferences to professional life, there is rarely time to reflect on what presenters and other conference attendees learn from participating in conferences or how conferences promote engagement and equity in the field as a whole. A significant portion of most conference time is devoted to the delivery of oral presentations, which traditionally are delivered in a lecture style, with questions being initiated by a predictable few during question-and-answer sessions.

In this essay, we discuss how you can use a backward design approach and scientific presenting strategies to overcome three key challenges to effectively presenting to diverse conference audiences. The challenges we consider here include the following:

  • Engagement in learning: ensuring that your audience is engaged and retains what is important from a talk
  • Promoting equity: creating an environment that is inclusive of all members of the research field
  • Receiving feedback: gathering input from the professional community to improve as a researcher and presenter

At conferences, as in more formal educational settings, learning and the advancement of the field are paramount. Thus, we wrote these presenting challenges to align with the central themes presented in the scientific teaching framework developed by Handelsman and colleagues (2007). “Learning” aligns with “active learning,” “equity” with “diversity,” and “feedback” with “assessment.” Using the scientific teaching framework and a backward design approach, we propose using evidence-based teaching strategies for scientific presenting in order to increase learning, equity, and quality feedback. We challenge you, the presenter, to consider how these strategies might benefit your future presentations.

BACKWARD DESIGN YOUR PRESENTATION: GOALS AND AUDIENCE CONSIDERATIONS

How will you define the central goals of your presentation and frame your presentation based on these goals? Begin with the end in mind by clearly defining your presentation goals before developing content and activities. This is not unlike the process of backward design used to plan effective learning experiences for students ( McTighe and Thomas, 2003 ). Consider what you, as a presenter, want to accomplish. You may want to share results supporting a novel hypothesis that may impact the work of colleagues in your field or disseminate new techniques or methodologies that could be applied more broadly. You may seek feedback about an ongoing project. Also consider your audience and what you hope they will gain from attending. You may want to encourage your colleagues to think in new and different ways or to create an environment of collegiality. It is good to understand your audience’s likely goals, interests, and professional identities before designing your presentation.

How can you get to know these important factors about your audience? Although it may not be possible to predict or know all aspects of your audience, identify sources of information you can access to learn more about them. Conference organizers, the website for the conference, and previous attendees may be good sources of information about who might be in attendance. Conference organizers may have demographic information about the institution types and career stages of the audience. The website for a conference or affiliated society often describes the mission of the organization or conference. Finally, speaking with individuals who have previously attended the conference may help you understand the culture and expectations of your audience. This information may enable you to tailor your talk and select strategies that will engage and resonate with audience members of diverse backgrounds. Most importantly, reflecting on the information you gather will allow you to evaluate and better define your presentation goals.

Before designing your presentation, write between two and five goals you have for yourself or your audience (see Vignettes 1 and 2 for sample goals). Prioritize your goals and evaluate which can be accomplished with the time, space, and audience constraints you face. Once you have established both your goals and knowledge of who might attend your talk, it is time to design your talk. The Scientific Presenting section that follows offers specific design suggestions to engage the audience in learning, promote equity, and receive high-quality feedback.

Situation: Mona Harrib has been asked to give the keynote presentation at a regional biology education research conference. As a leader in the field, Mona is well known and respected, and she has a good grasp on where the field has been and where it is going now.

Presentation Goals: She has three goals she wants to accomplish with her presentation: 1) to introduce her colleagues to the self-efficacy framework, 2) to provide new members of her field opportunities to learn about where the field has been, and 3) to connect these new individuals with others in the field.

Scientific Presenting Strategy: Mona has 50 minutes for her presentation, with 10 minutes for questions. Her opening slide, displayed as people enter the room, encourages audience members to “sit next to someone you have not yet spoken to.” Because her talk will discuss self-efficacy theory and the various origins of students’ confidence in their ability to do science, she begins by asking the audience members to introduce themselves to their neighbors and to describe an experience in which they felt efficacious or confident in their ability to do something and why they felt confident.  She circulates around the room, and asks five groups to share their responses. This provides an audience-generated foundation that she uses to explain the framework in more detail. For historical perspective, she relates each framework component back to prior research in the field. She ends with some recent work from her research group and asks participants to discuss with their partners how the framework could be used to explain the results of her recent study. She again gathers and reports several examples to the whole group that illustrate ways in which the data might be interpreted. She then asks the audience to write a question that they still have about this research on note cards, which she collects and reviews after the presentation. After reviewing the cards, she decides to incorporate a little more explanation about a few graphs in her work to help future audiences digest the information.

Situation: Antonio Villarreal is a postdoctoral fellow at the University of California, Berkeley, who has recently been selected for a 15-minute presentation in the Endocytic Trafficking Minisymposium at the American Society for Cell Biology Annual Meeting. He has attended this conference twice, so he has a sense of the audience, space, and culture of the meeting. In his past experiences, he has found that the talks often blur together, and it is especially difficult to remember key ideas from the later talks in each session.

Presentation Goals: With a manuscript in preparation and his upcoming search for a faculty position, Antonio has the following three goals for his presentation: 1) to highlight the significance of his research in a memorable way; 2) to keep the audience engaged, because his presentation is the ninth out of 10 talks; and 3) to receive feedback that will prepare him to give professional job talks.

Scientific Presenting Strategy: Antonio took a class as a postdoctoral fellow about evidence-based practices in teaching and decides he would like to incorporate some active learning into his talk to help his audience learn. He worries that with only 15 minutes he does not have a lot of time to spare. So he sets up the background and experimental design for the audience and then projects only the two axes of his most impactful graph on the screen with a question mark in the middle where the data would be. Rather than simply showing the result, he asks the audience to turn to a neighbor and make a prediction about the results they expect to see. He cues the audience to talk to one another by encouraging them to make a bold prediction! After 30 seconds, he quells the chatter and highlights two different predictions he heard from audience members before sharing the results. At the end of his presentation, he asks the audience to turn to a neighbor once again and discuss what the results mean and what experiment they would try next. He also invites them to talk further with him after the session. The questions Antonio receives after his talk are very interesting and help him consider alternative angles he could pursue or discuss during future talks. He also asks his colleague Jenna to record his talk on his iPhone, and he reviews this recording after the session to prepare him for the job market.

SCIENTIFIC PRESENTING: USING A SCIENTIFIC TEACHING PERSPECTIVE TO DESIGN CONFERENCE PRESENTATIONS

Presenters, like teachers, often try to help their audiences connect new information with what they already know ( National Research Council, 2000 ). While a conference audience differs from a student audience, evidence and strategies collected from the learning sciences can assist in designing presentations to maximize learning and engagement. We propose that the scientific teaching framework, developed by Handelsman and colleagues (2007) to aid in instructional design, can be used as a tool in developing presentations that promote learning, are inclusive, and allow for the collection of useful feedback. In this section, we discuss the three pillars of scientific teaching: active learning, diversity, and assessment. We outline how they can be used to address the central challenges outlined earlier and provide specific tips and strategies for applying scientific teaching in a conference setting.

Challenge: Engagement in Learning

Consider the last conference you attended: How engaged were you in the presentations? How many times did you check your phone or email? How much did you learn from the talks you attended? Professional communities are calling for more compelling presentations that convey information successfully to a broad audience (e.g., Carlson and Burdsall, 2014 ; Langin, 2017 ). Active-learning strategies, when combined with constructivist approaches, are one way to increase engagement, learning, and retention ( Prince, 2004 ; Freeman et al. , 2014 ). While active-learning strategies are not mutually exclusive with the use of PowerPoint presentations in the dissemination of information, they do require thoughtful design, time for reflection, and interaction to achieve deeper levels of learning ( Chi and Wylie, 2014 ). This may be as simple as allowing 30–60 seconds for prediction or discussion in a 15-minute talk. On the basis of calls for change from conference goers and organizers and research on active-learning techniques, we have identified several potential benefits of active learning likely to enhance engagement in conference presentations:

  • Active learning increases engagement and enthusiasm. Active learning allows learners to maintain focus and enthusiasm throughout a learning experience (e.g., Michael, 2006 ). Use of active learning may particularly benefit audience members attending long presentations or sessions with back-to-back presenters.
  • Active learning improves retention of information. Active reflection and discussion with peers supports incorporation of information into one’s own mental models and creates the connections required for long-term retention of information (reviewed in Prince, 2004 ).
  • Active learning allows for increased idea exchange among participants. Collaborative discourse among individuals with differing views enhances learning, promotes argumentation, and allows construction of new knowledge ( Osborne, 2010 ). Active-learning approaches foster idea exchange and encourage interaction, allowing audience members to hear various perspectives from more individuals.
  • Active learning increases opportunities to build relationships and expand networks. Professional networking is important for expansion of professional communities, enhancing collaborations, and fostering idea exchange. Short collaborative activities during presentations can be leveraged to build social networks and foster community in a professional setting, similar to how they are used in instruction ( Kember and Leung, 2005 ; Kuh et al ., 2006 ).

While a multitude of ways to execute active learning exist, we offer a few specific suggestions to quickly engage the audience during a conference presentation ( Table 1 ). In the spirit of backward design, we encourage you to identify learning activities that support attainment of your presentation goals. Some examples can be found in Vignettes 1 and 2 (section 1), which illustrate hypothetical scenarios in which active learning is incorporated into presentations at professional conferences to help meet specific goals.

Active-learning strategies for conference presentations

Similar to giving a practice talk before the conference, we encourage you to test out active-learning strategies in advance, particularly if you plan to incorporate technology, because technological problems can result in disengagement ( Hatch et al ., 2005 ). Practicing presentation activities within a research group or local community will provide guidance on prompts, timing, instructions, and audience interpretation to identify problems and solutions before they occur during a presentation. This will help to avoid activities that are overly complex or not purpose driven ( Andrew et al ., 2011 ).

Challenge: Equity and Participation

Consider the last conference you attended: Did you hear differing opinions about your work or did the dominant paradigms prevail? Who asked questions; was it only high-status experts in the field? Did you hear from multiple voices? Did newer members, like graduate students and postdoctoral fellows, engage with established members of the community? In classroom settings, equity and diversity strategies improve learning among all students and particularly support students from underrepresented groups in science by decreasing feelings of exclusion, alleviating anxiety, and counteracting stereotype threat ( Haak et al ., 2011 ; Walton et al ., 2012 ; Eddy and Hogan, 2014 ). Likewise, in a conference setting, strategies that promote equitable participation and recognize the positive impact of diversity in the field may help increase equity more broadly and promote a sense of belonging among participants. Conference audiences are oftentimes even more diverse than the typical classroom environment, being composed of individuals from different disciplines, career stages, and cultures. Incorporating strategies that increase the audience’s understanding and feelings of inclusion in the professional community may impact whether or not an individual continues to engage in the field. We have identified three central benefits of equity strategies for presenters and their professional communities:

  • Equity strategies increase accessibility and learning. As a presenter, you should ensure that presentations and presentation materials 1) allow information to be accessed in various forms, so that differently abled individuals may participate fully, and 2) use straightforward language and representations. You can incorporate accessible versions of conference materials (e.g., captioned videos) or additional resources, such as definitions of commonly used jargon necessary to the presentation (e.g., Miller and Tanner, 2015 ). You may consider defining jargon or acronyms in your talk to increase accessibility for individuals who might struggle to understand the full meaning (e.g., new language learners or individuals who are new to the field).

Equity strategies for conference presentations (as presented in Tanner, 2013 )

  • Equity strategies promote a sense of belonging among all individuals. Creating a welcoming, inclusive environment will make the community attractive to new members and help to increase community members’ sense of belonging. Sense of belonging helps individuals in a community to view themselves as valued and important, which serves to motivate these individuals toward productive action. This increases positive affect, boosts overall community morale, and supports community development ( Winter-Collins and McDaniel, 2000 ). Belonging can increase if it is specifically emphasized as important and if individuals make personal connections to others, such as during small-group work (see Table 2 ).

Many strategies discussed in prior sections, such as using active learning and having clear goals, help to promote equity, belonging, and access. In Table 2 , we expand on previously mentioned strategies and discuss how specific active-learning and equity strategies promote inclusiveness. These tips for facilitation are primarily drawn from Tanner’s 2013 feature on classroom structure, though they apply to the conference presentation setting as well.

An overarching goal of conferences is to help build a thriving, creative, inclusive, and accessible community. Being transparent about which equity strategies you are using and why you are using them may help to promote buy-in and encourage others to use similar strategies. By taking the above actions as presenters and being deliberate in our incorporation of equity and diversity strategies, we can help our professional communities to thrive, innovate, and grow.

Challenge: Receiving Feedback

Consider your last conference presentation: What did you take away from the presentation? Did you gather good ideas during the session? Were the questions and comments you received useful for advancing your work? If you were to present this work again, what changes might you make? In the classroom, assessment drives learning of content, concepts, and skills. At conferences, we, as presenters, take the role of instructor in teaching our peers (including members of our research field) about new findings and innovations. However, assessment at conferences differs from classroom assessment in important ways. First, at conferences, you are unlikely to present to the same audience multiple times; therefore, the focus of the assessment is purely formative—to determine whether the presentation accomplished its goals. This feedback can aid in your professional development toward being an effective communicator. Second, at conferences, you are speaking to a diverse group of colleagues who have varied expertise, and feedback from the audience will provide information that may improve your research. Indeed, conferences are a prime environment to draw on a diversity of expertise to identify relevant information, resources, and alternative interpretations of data. These characteristics of conferences give rise to three possible types of presentation assessments ( Hattie and Timperley, 2007 ):

  • Feed-up: assessment of the achievement of presentation goals: Did I achieve my goals as a presenter?
  • Feed-back: assessment of whether progress toward project goals is being achieved: Is my disciplinary work or research progressing effectively?
  • Feed-forward: input on which activities should be undertaken next: What are the most important next steps in this work for myself and my professional community?

Though a lot of feedback at conferences occurs in informal settings, you can take the initiative to incorporate assessment strategies into your presentation. Many of the simple classroom techniques described in the preceding sections, like polling the audience and hearing from multiple voices, support quick assessment of presentation outcomes ( Angelo and Cross, 1993 ). In Table 3 , we elaborate on possible assessment strategies and provide tips for gathering effective feedback during presentations.

Assessment and feedback strategies for conference presentations

Table 3 note: We suggest these questions for a simple, yet informative postpresentation feed-up survey about your presentation: What did you find most interesting about this presentation? What, if anything, was unclear or were you confused about (a.k.a. muddiest point)? What is one thing that would improve this presentation? Similarly, to gather information for a feedback/feed-forward assessment, we recommend: What did you find most interesting about this work? What about this project needs improvement or clarification? What do you consider an important next step that this work might take?

Technology can assist in implementing assessment, and we predict that there will be many future technological innovations applicable to the conference setting. Live tweeting or backchanneling is occurring more frequently alongside presentations, with specific hashtags that allow audience members to initiate discussions and generate responses from people who are not even in the room (Wilkinson et al., 2015). After the presentation, you and your audience members can continue to share feedback and materials through email list servers and QR codes. Self-assessment by reviewing a video from the session can support both a better understanding of audience engagement and self-reflection (van Ginkel et al., 2015). There are benefits from gathering data from multiple assessment strategies, but as we will discuss in the next section, there are barriers that impact the number of recommended learning, equity, and assessment strategies you might choose to implement.
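If you choose to distribute a feedback survey by QR code, the sketch below shows one way to generate the code image; it assumes the third-party Python qrcode package and uses a hypothetical survey URL.

```python
# Generate a QR code that points the audience at a feedback survey.
# Requires the third-party package: pip install "qrcode[pil]"
import qrcode

survey_url = "https://example.com/my-talk-feedback"  # hypothetical survey link

img = qrcode.make(survey_url)   # build the QR code as a PIL image
img.save("feedback_qr.png")     # embed this image on your closing slide or handout
```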

NAVIGATING BARRIERS TO SCIENTIFIC PRESENTING

Though we are strong advocates for a scientific presenting approach, there are several important barriers to consider. These challenges are similar to what is faced in the classroom, including time, space, professional culture, and audience/student expectations.

One of the most important barriers to consider is culture, as reflected in the following quote:

Ironically, the oral presentations are almost always presented as lectures, even when the topic of the talk is about how lecturing is not very effective! This illustrates how prevalent and influential the assumptions are about the expected norms of behavior and interaction at a scientific conference. Even biologists who have strong teaching identities and are well aware of more effective ways to present findings choose, for whatever reason (professional culture? professional identity?), not to employ evidence-based teaching and communication methods in the venue of a scientific conference. ( Brownell and Tanner, 2012 , p. 344)

As this quote suggests, professional identity and power structures exist within conference settings that may impact the use of scientific presenting strategies. Trainees early in their careers will be impacted by disciplinary conference norms and advisor expectations and should discuss incorporating new strategies with a trusted mentor. In addition, incorporating scientific presenting strategies can decrease your control as a presenter and may even invoke discomfort and threaten your or your audience’s professional identities.

Balance between content delivery and active engagement presents another potential barrier. Some may be concerned that active learning takes time away from content delivery or that using inclusive practices compromises the clarity of a central message. Indeed, there is a trade-off between content and activity, and presenters have to balance presenting more results with time spent on active learning that allows the audience to interpret the results. We suggest that many of these difficulties can be solved by focusing on your goals and audience background, which will allow you to identify which content is critical and hone your presentation messaging to offer the maximum benefit to the audience. Remember that coverage of content does not ensure learning or understanding and that you can always refer the audience to additional content or clarifying materials by providing handouts or distributing weblinks to help them engage as independent learners.

Physical space and time may limit participants’ interaction with you and one another. Try to view your presentation space in advance and consider how you will work within and possibly modify that space. For example, if you will present in a traditional lecture hall, choose active learning that can be completed by an individual or pairs instead of a group. By being aware of the timing, place in the conference program, and space allotted, you can identify appropriate activities and strategies that will fit your presentation and have a high impact. Available technology, support, and resources will also impact the activities and assessments you can implement and may alleviate some space and time challenges.

As a novice scientific presenter dealing with the above barriers and challenges, you are bound to have "failures" or less-than-ideal attempts at scientific presenting. The important thing to remember is that presenting is a scientific process: just as experiments rarely work perfectly the first time they are run, presenting in a new and exciting way rarely goes smoothly at first. As in science, these challenges and barriers can be overcome with time, iteration, and thoughtful reflection.

STRUCTURING A CONFERENCE TO FACILITATE SCIENTIFIC PRESENTING

Although presenters can opt to use backward design and incorporate scientific presenting strategies, they do not control other variables like the amount of time allotted to each speaker, the size or shape of the room they present in, or the technology available. These additional constraints are still important and may impact a presenter’s ability to use audience-centered presentation methods. Conference organizers are in a powerful position to support presenters’ ability to implement the described strategies and to provide the necessary logistical support to maximize the likelihood of success. Organizers often set topics, determine the schedule, book spaces, identify presenters, and help establish conference culture.

So how can conference organizers effect change that will promote active engagement and equity in conference presentations?

  • Use backward design for the conference as a whole. Just as presenters can use backward design to set their specific learning goals, conference organizers can set goals for the meeting as a whole to support the conference community.
  • Vary conference structures and formats based on the needs of the community. Conference presentation structures vary widely, but it is worth considering why certain session structures are used. To what extent does it serve the community to have back-to-back 10-minute talks for several hours? Many people will have the chance to present, but does the audience gain anything? Are there topics that would be better presented in a workshop format or a roundtable discussion? What other structures might benefit the conference community and their goals?
  • Choose a space that is conducive to active presentations or consider creative ways to use existing spaces. The spaces available for conferences are typically designed for lecture formats. However, organizers can seek out spaces that facilitate active presenting by choosing rooms with adaptable formats in which furniture can be moved to facilitate small-group discussions. They can also provide tips on how to work within existing spaces, such as encouraging participants to sit near the front of a lecture hall or auditorium.
  • Give explicit expectations to presenters. Organizers could inform presenters that active, engaging, evidence-based sessions are encouraged or expected. This will help cultivate the use of scientific presenting within the community.
  • Provide examples or support for presenters to aid in design of active learning, equity strategies, and assessment. Videos with examples of the described techniques, a quick reference guide, or access to experts within the field who would be willing to mentor presenters could be critical for supporting a conference culture that uses scientific presenting. For example, researchers at the University of Georgia have developed a repository of active-learning videos and instructions for instructors interested in developing these skills (REALISE—Repository for Envisioning Active-Learning Instruction in Science Education, https://seercenter.uga.edu/realisevideos_howto).
  • Collect evidence about conference structure and use it to inform changes. Surveying audience members and presenters to better understand the benefits and challenges of particular session formats can help inform changes over multiple years. Organizers should coordinate these efforts with presenters so they are aware of what data will be collected and disseminated back to them.

Although scientific teaching has increasingly become standard practice for evidence-based teaching of science courses, there are potentially great benefits for transforming our oral presentations in science and science education by incorporating the rigor, critical thinking, and experimentation that are regularly employed within research. The strategies suggested in this paper can serve as a starting point for experimentation and evaluation of presentation and conference efficacy. Using scientific presentation strategies may expedite the advancement of fields by increasing engagement and learning at conference presentations. Equity strategies can increase inclusion and community building among members of our research areas, which will help research fields to grow and diversify. Finally, regularly incorporating assessment into our presentations should improve the quality and trajectory of research projects, further strengthening the field. Both individual presenters and conference organizers have a role to play in shifting conference culture to tackle the challenges presented in this paper. We urge you to consider your role in taking action.

Acknowledgments

We thank Justin Hines, Jenny Knight, and Kimberly Tanner for thoughtful suggestions on early drafts of this article.

  • Andrews T. M., Leonard M. J., Colgrove C. A., Kalinowski S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE—Life Sciences Education, (4), 394–405.
  • Angelo T. A., Cross K. P. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco: Jossey-Bass.
  • Bassett-Jones N. (2005). The paradox of diversity management, creativity and innovation. Creativity and Innovation Management, (2), 169–175.
  • Brownell S. E., Tanner K. D. (2012). Barriers to faculty pedagogical change: Lack of training, time, incentives, and … tensions with professional identity. CBE—Life Sciences Education, (4), 339–346.
  • Carlson M., Burdsall T. (2014). In-progress sessions create a more inclusive and engaging regional conference. American Sociologist, 177. doi:10.1007/s12108-014-9220-2
  • Chi M. T. H., Wylie R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, (4), 219–243.
  • Eddy S. L., Hogan K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, (3), 453–468.
  • Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H., Wenderoth M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA, (23), 8410–8415.
  • Haak D. C., HilleRisLambers J., Pitre E., Freeman S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, (6034), 1213–1216.
  • Hacker K. (2013). Community-based participatory research. Los Angeles, CA: Sage.
  • Handelsman J., Miller S., Pfund C. (2007). Scientific teaching. New York: Macmillan.
  • Hatch J., Jensen M., Moore R. (2005). Manna from heaven or clickers from hell? Journal of College Science Teaching, (7), 36.
  • Hattie J., Timperley H. (2007). The power of feedback. Review of Educational Research, (1), 81–112.
  • Kember D., Leung D. Y. (2005). The influence of active learning experiences on the development of graduate capabilities. Studies in Higher Education, (2), 155–170.
  • Kuh G., Kinzie J., Buckley J., Bridges B. K., Hayek J. C. (2006). What matters to student success: A review of the literature (Commissioned report for the National Symposium on Postsecondary Student Success). Retrieved July 3, 2017, from https://nces.ed.gov/npec/pdf/kuh_team_report.pdf
  • Langin K. M. (2017). Tell me a story! A plea for more compelling conference presentations. The Condor, (2), 321–326.
  • McTighe J., Thomas R. S. (2003). Backward design for forward action. Educational Leadership, (5), 52–55.
  • Michael J. (2006). Where's the evidence that active learning works? Advances in Physiology Education, (4), 159–167.
  • Miller S., Tanner K. D. (2015). A portal into biology education: An annotated list of commonly encountered terms. CBE—Life Sciences Education, (2), fe2.
  • Motschenbacher H. (2017). Inclusion and foreign language education. ITL—International Journal of Applied Linguistics, (2), 159–189.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (expanded ed.). Washington, DC: National Academies Press. https://doi.org/10.17226/9853
  • Osborne J. (2010). Arguing to learn in science: The role of collaborative, critical discourse. Science, (5977), 463–466.
  • Prince M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, (3), 223–231.
  • Richard O. C., Shelor R. M. (2002). Linking top management team age heterogeneity to firm performance: Juxtaposing two mid-range theories. International Journal of Human Resource Management, (6), 958–974.
  • Rossi A. L., Lopez E. J. (2017). Contextualizing competence: Language and LGBT-based competency in health care. Journal of Homosexuality, (10), 1330–1349.
  • Tanner K. D. (2013). Structure matters: Twenty-one teaching strategies to promote student engagement and cultivate classroom equity. CBE—Life Sciences Education, (3), 322–331.
  • van Ginkel S., Gulikers J., Biemans H., Mulder M. (2015). Towards a set of design principles for developing oral presentation competence: A synthesis of research in higher education. Educational Research Review, 62–80.
  • Walton G. M., Cohen G. L., Cwir D., Spencer S. J. (2012). Mere belonging: The power of social connections. Journal of Personality and Social Psychology, (3), 513.
  • Wilkinson S. E., Basto M. Y., Perovic G., Lawrentschuk M., Murphy D. G. (2015). The social media revolution is changing the conference experience: Analytics and trends from eight international meetings. BJU International, (5), 839–846. doi:10.1111/bju.12910
  • Winter-Collins A., McDaniel A. M. (2000). Sense of belonging and new graduate job satisfaction. Journal for Nurses in Professional Development, (3), 103–111.


Presentation Transcript

Google Scholar: Intro & Search Tips

What is Google Scholar? Find it at http://scholar.google.com (note: no "www"). Google Search vs. Google Scholar Search: unlike regular Google, Google Scholar searches for academic literature across many disciplines and sources, including articles, theses, books, abstracts, and court opinions from academic publishers, professional societies, online repositories, universities, and other websites.

Getting Started – Auto Settings: First time? Use the library link to confirm your UWest status; full-text articles from library databases will then show up automatically. You can also set this up without going through the library (see the manual settings step below).

Getting Started – Optional Settings: Go to http://scholar.google.com and open the settings.

Getting Started – Optional Settings (continued): These are personal preferences that make it easier to keep track of your searches and to download bibliographic data. Using Zotero? Select EndNote here. Remember to save!

Getting Started – Manual Settings (skip this step if you came through the library link above): (1) In the settings, select "Library links." (2) Type "uwest." (3) Check the UWest option. Google Scholar will then cross-check your search results to see if our library has the full text.

The Search: Type in your keywords, for example “four noble truths” meditation or “social learning theory” adults. Put a phrase in quotation marks when you want to find the words in that exact order, because the phrase has a special meaning when the words are used together.

The Search - Advanced Click on the down arrow to go to advanced search

The Search - Advanced: The exact-phrase field works the same as using quotation marks. When you choose to search in the title of the article, you will get fewer results, but they are likely to be very relevant because the search words appear in the title.
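If you prefer to script your searches, the short Python sketch below builds a Google Scholar results URL from an exact phrase, optional extra keywords, and a year range. It relies only on the query parameters visible in the browser address bar (q, as_ylo, as_yhi); Google Scholar has no official API, so treat this as an illustrative helper rather than a supported interface.

    # Minimal sketch: build a Google Scholar search URL with an exact phrase
    # and an optional year range, using the publicly visible URL parameters.
    from urllib.parse import urlencode

    def scholar_url(phrase, extra_terms="", year_from=None, year_to=None):
        query = f'"{phrase}" {extra_terms}'.strip()  # quotation marks force an exact match
        params = {"q": query}
        if year_from:
            params["as_ylo"] = year_from             # earliest publication year
        if year_to:
            params["as_yhi"] = year_to               # latest publication year
        return "https://scholar.google.com/scholar?" + urlencode(params)

    # Example: the slide's sample search, limited to 2015 and later.
    print(scholar_url("social learning theory", "adults", year_from=2015))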

Filtering options: The main result links will most likely take you to a preview or a "pay to access" website; any free or library-affiliated links appear on the right side. You can also create an alert to get an email when a new article about your search topic appears. If our library has the full-text article, a link to it will show.

Reading a search result (this example item is a book):
  • Author: first/middle name initials and full last name.
  • The website that hosts the bibliographic information for the item.
  • Year of publication.
  • "Cited by": this item was mentioned in 81 other items; click the link to see those 81.
  • "Related articles": articles on similar topics, but maybe not containing the search words you used.
  • "All versions": other websites that list information about this item.
  • The import link: click it to download the bibliographic information into reference managers such as EndNote and Zotero.
  • "More": some additional features (see the next slide).

Get ‘More’: Since this item is a book, the full-text article link does not apply. Under ‘More’ you can check whether the library has it as an ebook or a printed book: it searches for an ebook first and, if none is found, you can search the library catalog for printed copies.

Get ‘More’ (2): ‘Library search’ will direct you to WorldCat, where you can find the nearest library that has the book. ‘Cite’ will help you with the citation of the item in APA, MLA, or Chicago style; be careful, as the generated citations sometimes have inconsistent capitalization of titles.

It is always good to know what you are looking at before you decide to read or search for an item, so let’s see how you can easily tell what’s what in Google Scholar.

Identifying the Item: For books, Google Scholar shows [Book] at the front of the title link. Other items are not labeled this way (and some entries appear only as citations), so you need to read the bibliographic line to tell what they are, as the next slides show.

Identifying the Item: The item “Gendered Religious Organizations...” is an article inside the journal Gender & Society. When there is text between the author and the year, that text is the journal or book title, and the item itself is inside that journal or book. Need the page numbers? Click ‘More,’ then ‘Cite,’ and you will find them. Likewise, the item “Buddhism, Women, and Caste: The Case of the Newar Buddhists of the Kathmandu Valley” is a chapter inside a book titled Buddhist Women and Social Justice.

What’s That? A Book! When there is nothing between the author and the year (only the author, the year, and the publisher), the item is most likely a book.

What’s That? A Journal Article or Book Chapter! Here the title between the author and the year is a journal title (for an article) or a book title (for a chapter). How do you know which is a journal and which is a book? Journal titles tend to be shorter and/or more generic.

Also: Explore Our Databases. Google Scholar may provide many relevant results, but our databases (ATLA, ProQuest Central) have features not available on Google Scholar:
  • Get full-text-only results.
  • Limit a search to peer-reviewed materials or specific journals.
  • Browse different issues of a journal.
  • Filter by subject, language, journal, and much more.

We are a work in progress. Contact us if you need help or to set up a one-on-one appointment: http://lib.uwest.edu/about-library/contact-us. Suggestions and comments: http://lib.uwest.edu/about-library/suggestions-comments



Google Slides: How to make a phone-friendly, vertical presentation

While your presentation displays well on a laptop, TV, monitor, or projector, the default landscape orientation doesn't play well on smartphones. If you plan to give a quick presentation on a smartphone or want to add a touch of novelty to stand out, use the steps below to switch to vertical orientation in Google Slides.

Apart from enhancing the mobile experience, vertical orientation simplifies the printing process, delivers a better flow of information, and makes your presentation stand out among other horizontal slides.

Although Google offers feature-rich Slides mobile apps on iPhone and Android, vertical orientation is only available on Google Slides for the web.

Use vertical orientation in Google Slides

You shouldn't create and complete a presentation in landscape mode and then change the orientation at the end; doing so may mess up the graphical elements of your presentation. Follow the steps below to use vertical orientation in Google Slides (a small unit-conversion sketch follows the list).

  • Navigate to Google Slides on the web and open a presentation you want to edit.
  • Click File at the top and select Page setup .
  • Expand the drop-down menu at the top of the dialog to see the default options. Standard 4:3 is ideal for viewing your presentation on a tablet. Widescreen 16:9 is suitable for viewing a slide on a TV, projector, or monitor. Widescreen 16:10 is the preferred dimension for modern laptops with taller displays.
  • None of the default options offer vertical orientation. Select Custom .
  • Expand the side menu and select Inches , Centimeters , Points , or Pixels . Let's select Inches .
  • Type 9 x 19.5 (preferred for modern smartphone displays) and select Apply . You can also select Pixels and type 1080 x 1920 (common on most Android phones).
  • The entire slide deck now appears in vertical orientation.
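If you would rather enter the same custom size in one of the other units the Page setup dialog offers, the tiny sketch below converts the suggested 9 x 19.5 inches into centimeters, points, and pixels. The pixel figure assumes the common 96-pixels-per-inch convention, which may not exactly match how Google Slides maps pixels to physical size.

    # Minimal sketch: express a 9 x 19.5 inch page size in the other units
    # offered by Page setup. The pixel value assumes 96 px per inch (a common
    # convention, not an official Google Slides specification).
    IN_TO_CM, IN_TO_PT, IN_TO_PX = 2.54, 72, 96

    width_in, height_in = 9, 19.5
    print(f"Centimeters: {width_in * IN_TO_CM:.2f} x {height_in * IN_TO_CM:.2f}")
    print(f"Points:      {width_in * IN_TO_PT:.0f} x {height_in * IN_TO_PT:.0f}")
    print(f"Pixels:      {width_in * IN_TO_PX:.0f} x {height_in * IN_TO_PX:.0f}")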

Now, you can use Google Slides features to create an ideal presentation.


Popular portrait orientation dimensions.

Whether you want to create a portrait presentation for printing or for smartphones and smaller screens, here are some common vertical slide sizes worth knowing.

  • A3: 29.7 x 42 cm
  • A4: 21 x 29.7 cm
  • US Letter: 8.5 x 11 inches (close to, but not identical to, A4)
  • US Legal: 8.5 x 14 inches

Try the dimensions below to view a presentation on a smartphone or upload it to a social media network like Instagram, TikTok, or Snapchat.

  • iPhone 15 Pro: 1179 x 2556 pixels
  • Samsung Galaxy S24 Ultra: 1440 x 3120 pixels
  • Google Pixel 8 Pro: 1344 x 2992 pixels

If you don't want to deal with these unusual pixel numbers, use 1080 x 1920 pixels in the page setup menu for vertical slides; the short sketch below compares that size with the phone resolutions listed above.
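To see how the listed phone resolutions compare with the simpler 1080 x 1920 suggestion, the sketch below reduces each resolution to its aspect ratio and scales it to a 1080-pixel width; the resolutions are the ones quoted above, so only the arithmetic is new.

    # Minimal sketch: reduce each phone resolution to its aspect ratio and
    # scale it to a 1080 px width, to show how close 1080 x 1920 (16:9) comes.
    from math import gcd

    phones = {
        "iPhone 15 Pro": (1179, 2556),
        "Samsung Galaxy S24 Ultra": (1440, 3120),
        "Google Pixel 8 Pro": (1344, 2992),
    }

    for name, (w, h) in phones.items():
        ratio = f"{w // gcd(w, h)}:{h // gcd(w, h)}"
        height_at_1080 = round(h * 1080 / w)
        print(f"{name}: {w} x {h} ({ratio}) -> about 1080 x {height_at_1080}")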

When should you use vertical slides?

Vertical slides come with several benefits. You need to factor in your audience and the context of the presentation. Here are the top reasons for using a vertical orientation in Google Slides.

  • Better mobile experience: A vertical orientation makes more sense if you plan to view your presentation on the phone. Scrolling on vertical slides feels more intuitive than tapping them.
  • Seamless printing: Since a vertical layout suits most standard paper sizes, you don't need to make any major tweaks to fit the content on paper.
  • Ideal for online presentations: Do you plan to share a presentation with your students or attendees over a video conference? Not everyone has a laptop to view your shared presentation. Use a vertical orientation that's more user-friendly for your audience.
  • Suitable for social media platforms: Go with a portrait ratio if you want to share a presentation during livestreaming on a social media platform like TikTok or YouTube.
  • Novelty factor: When everyone else addresses the audience with the same horizontal slides, a vertical presentation adds a unique touch to your pitch and helps it stand out.

Using vertical orientation in a presentation: Our observations

Before you apply a vertical orientation, keep the points below in mind.

  • Google Slides doesn't allow you to mix horizontal and vertical slides. The tweak applies to the entire presentation when you change the page setup.
  • If you use a Google Slides template, adjust your designs accordingly. Most templates are designed for landscape orientation and don't use flexible elements that automatically fit a vertical slide.


Optimize your presentation for mobile convenience.

Whether you use a horizontal or portrait orientation, your presentation must hit the bull's eye to catch your audience's attention. Instead of creating a presentation from scratch and ending up with a bland one, use one of the top Google Slides templates to speed up the process.


