Research Workflow and the Scientific Approach

From Curiosity to Credibility: Why the Research Workflow Matters

Imagine scrolling your feed and stumbling into a lively argument under a news post. One person drops a link to an official-looking report, another shares a personal story, and a third posts a meme with a bold claim. Some of it sounds plausible; some feels manipulative. You wonder: How would I know what’s actually true?

That spark of wondering is where research begins. But curiosity alone won’t carry you to a trustworthy conclusion. You need a workflow—a repeatable way to move from “I’m wondering…” to “Here’s my evidence-based answer.” In communication research, that means making your steps visible, testable, and fair so others can check your work and build on it. That visibility is what turns private hunches into credible knowledge.

This chapter introduces the research workflow (the practical steps of a project) and the scientific approach (the guiding ideas that keep our work systematic and open to scrutiny). We’ll start by examining the everyday shortcuts we use to make sense of the world—and why they’re not enough for building shared knowledge.


Everyday Ways of Knowing—and Their Limits

We can’t run a study for every decision, so we rely on mental shortcuts. They’re efficient, but they have limits:

  • Tradition is “what we’ve always done.” It offers stability but resists updating and often lacks evidence.
  • Authority defers to experts. Useful—until expertise is misapplied, biased, or conflicted.
  • Common sense feels “obvious,” yet it’s culture-bound and frequently contradictory.
  • Intuition is fast and sometimes insightful, but it’s shaped by emotion and bias.

Shortcuts are fine for daily life, but they’re unreliable for building knowledge others should trust. Research offers a more disciplined alternative.


The Scientific Approach: A More Reliable Path

The scientific approach is not a lab coat—it’s a mindset for minimizing error and bias. It’s guided by several core principles.

Empiricism

Claims should be grounded in observable, documentable evidence. Instead of debating whether a negative political ad “works,” we measure outcomes (e.g., attitudes, recall, intentions) under defined conditions.

Objectivity

Perfect neutrality is impossible, but procedures (clear protocols, blinding, preregistration) help keep our preferences from steering results. The goal is intersubjectivity: work so transparent that another careful researcher could follow our steps and reach the same conclusions.

Determinism (Probabilistic)

Communication outcomes are not random; they co-vary with message, channel, and audience factors. We expect probabilistic patterns (X raises the likelihood of Y), not guarantees.
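
To make the idea concrete, here is a minimal R sketch with simulated data; the variable names and effect sizes are invented for illustration. A logistic regression estimates how a message feature shifts the probability of an outcome rather than guaranteeing it.

  # Simulate 500 viewers; fact_check = 1 if the post carried a label.
  set.seed(42)
  fact_check <- rbinom(500, 1, 0.5)
  # The true pattern is probabilistic: the label lowers the odds of sharing.
  share <- rbinom(500, 1, plogis(-0.2 - 0.8 * fact_check))
  # The model estimates a shift in likelihood, not a guarantee.
  model <- glm(share ~ fact_check, family = binomial)
  summary(model)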

Control

To argue that X causes Y, we must rule out alternatives (comparison groups, consistent procedures, manipulation checks). Control is how we move from “they’re related” (correlation) to “X probably produces change in Y” (causation).
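
A hedged sketch of this logic in R, again with invented data: random assignment to a comparison group is what supports the move from correlation to causation.

  # Randomly assign 200 participants to see the ad (treatment) or not (control).
  set.seed(7)
  group <- sample(rep(c("control", "treatment"), each = 100))
  # Simulate attitude scores; the treatment adds a small true effect.
  attitude <- rnorm(200, mean = 4) + ifelse(group == "treatment", 0.5, 0)
  # With random assignment and consistent procedures, a group difference
  # supports a causal interpretation rather than a mere correlation.
  t.test(attitude ~ group)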

Replication

For findings to be trustworthy, they must be repeatable. A single study can be a fluke, a result of random error, or specific to an unmeasured context. Replication, where other researchers in other settings follow the original study’s methods and find similar results, is the ultimate safeguard against drawing premature conclusions. It builds collective confidence in a finding.
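
A quick simulation shows why a single study can mislead. Even when the true effect is zero, roughly 5% of studies cross the conventional p < .05 threshold by chance; this sketch is illustrative, not a template for any particular analysis.

  # 1,000 "studies" comparing two groups drawn from the SAME population.
  set.seed(451)
  p_values <- replicate(1000, t.test(rnorm(50), rnorm(50))$p.value)
  # About 5% come out "significant" by chance; replication catches these flukes.
  mean(p_values < .05)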

These principles drive the scientific method: theory → hypotheses → data → conclusions → theory refinement. But before you can follow the workflow, you need to know what you’re trying to accomplish.

The Goals of Research: Why Are We Doing This?

Not all research asks the same kind of question. Your goal will shape every decision you make. Broadly, research can have one of three primary goals:

  • Exploratory Research: This is about mapping a new or poorly understood territory. The guiding question is often “What is going on here?” Exploratory studies are common when a new technology or social phenomenon emerges. For example, an early study on TikTok might have aimed simply to describe the communication norms and genres developing on the platform. These studies often use qualitative methods to generate rich insights and lay the groundwork for future research.

  • Explanatory Research: This is the most common goal in social science, focused on testing theories and answering “Why?” questions. It seeks to explain the relationships between variables, often by testing specific hypotheses about cause and effect. For instance, an explanatory study might test whether exposure to a specific type of media content causes a change in viewers’ attitudes.

  • Evaluative Research: This is an applied form of research that asks, “Does this work?” It assesses the effectiveness of a specific program, campaign, or intervention. An example would be a study to determine if a non-profit’s media literacy workshop actually improved participants’ ability to identify misinformation.

Knowing your goal—to explore, explain, or evaluate—is the first step in designing a coherent and purposeful study.


The Research Workflow: Five Interconnected Stages

The workflow gives you a practical map. The stages overlap and loop back as you learn.

1) Conceptualization. Start broad (“How does misinformation spread?”) and refine to a focused research question using a literature review. You’ll surface what’s known, what’s contested, and where the gaps are (e.g., “Does a platform ‘fact-check’ label reduce sharing intention among 18–24-year-olds?”). Define key constructs you’ll later operationalize.

2) Design. Make the blueprint that will answer your question:

  • Methodology: survey, experiment, interview/focus group, (computational) content analysis, ethnography, or a justified mix.
  • Sampling: define the population, pick a sampling frame, and select a sample that fits your question and constraints (see the sketch after this list).
  • Measurement: map constructs to observable indicators (scales, behavioral traces, codes). Decide response options, item wording, and manipulation checks.
  • Ethics: plan informed consent, privacy, data security, and risk mitigation appropriate to your context and population.
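
To make the sampling step concrete, here is a minimal R sketch; the sampling frame and sizes are hypothetical.

  # A toy sampling frame: 5,000 student IDs (hypothetical).
  frame <- paste0("student_", 1:5000)
  # Draw a simple random sample of 300, without replacement by default.
  set.seed(2024)
  my_sample <- sample(frame, size = 300)
  head(my_sample)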

3) Data Collection. Run the plan. Consistency is everything: follow scripts, track versions, log deviations, and pilot your instruments. Small slips here (question order, inconsistent coder training) ripple into big validity problems later.

4) Data Analysis. Connect evidence to your question.

  • Quantitative: examine distributions, relationships, differences, and model parameters (with assumptions and effect sizes); see the sketch after this list.
  • Qualitative: code for themes, patterns, processes, and counterexamples; triangulate across sources and analysts.
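
A compact R sketch of the quantitative path, using simulated scores (the condition names and values are invented): inspect a distribution, test a group difference, and report an effect size.

  # Simulated sharing-intention scores for two conditions.
  set.seed(3)
  label    <- rnorm(100, mean = 3.4, sd = 1.2)   # saw the fact-check label
  no_label <- rnorm(100, mean = 3.9, sd = 1.2)   # control condition
  hist(label)                                    # examine distributions first
  t.test(label, no_label)                        # difference between conditions
  # A simple standardized effect size (Cohen's d with a pooled SD):
  (mean(label) - mean(no_label)) / sqrt((var(label) + var(no_label)) / 2)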

5) Communication. Share not just what you found, but how you got there. Your audience can be scholarly (conference, journal) or applied (white paper, client deck). Transparent reporting lets others evaluate and extend your work.

Iterative reality: Pilot data can send you back to redesign an item; surprising results can send you back to theory. That’s not failure—it’s how knowledge gets sharper.


Open, Reproducible, and Ethical by Default

Modern research culture favors work that others can inspect, rerun, and reuse. In a media/communication context—where data can be sensitive—“open” also means responsibly open.

Minimal Reproducibility Stack

  • Project structure: keep a consistent skeleton (data-raw/, data/, scripts/, figs/, doc/); see the sketch after this list.
  • Version control: use Git/GitHub for history and collaboration.
  • Computational notes: set a random seed; record session info and package versions; log important decisions.
  • Readme & codebook: document variables, units, coding rules, and any derived fields.

Repro Tip: Save figures and tables from code, not by screenshot. It guarantees you can recreate them when data or fonts change.
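
For example, with ggplot2 (assuming the package is installed, the figs/ folder from the skeleton above exists, and a plot object named p):

  library(ggplot2)
  p <- ggplot(mtcars, aes(wt, mpg)) + geom_point()
  # Saving from code means the exact figure can be regenerated on demand.
  ggsave("figs/wt-vs-mpg.png", plot = p, width = 6, height = 4, dpi = 300)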

Responsible Openness

  • Share what you can: analysis scripts, simulated or de-identified data, and detailed methods.
  • Protect what you must: remove direct identifiers (see the sketch after this list); paraphrase sensitive quotes that could be search-traced; respect platform Terms of Service and community norms.
  • Preregister when appropriate: record your main hypotheses, outcomes, and analysis plan before peeking deeply at data.
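
A minimal de-identification sketch in R; the column names are hypothetical, and real projects usually need more than this (e.g., checking for indirect identifiers).

  # A hypothetical raw export containing direct identifiers.
  raw <- data.frame(name = c("Ana", "Ben"),
                    email = c("a@x.edu", "b@x.edu"),
                    response = c(5, 3))
  # Drop identifiers and assign opaque study IDs in randomized order.
  deid <- subset(raw, select = -c(name, email))
  set.seed(99)
  deid$id <- sample(seq_len(nrow(deid)))
  write.csv(deid, "data/survey-deid.csv", row.names = FALSE)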

IRB Watch-Out: “Public” on the internet is not the same as “free to use without harm.” Consider context, expectations, and the risk of re-identification when quoting posts or linking handles.

FAIR Data (Right-Sized for Class Projects)

FAIR = Findable, Accessible, Interoperable, Reusable. For student work:

  • Use clear file names and structured folders; include a README.md.
  • Store tabular data in CSV with header rows and documented encodings (see the sketch after this list).
  • Define variable names, value labels, and missing codes in a short codebook.
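
A sketch of the CSV-plus-codebook pattern in R; file and variable names are illustrative.

  # Tabular data as CSV with a header row (write.csv includes headers by default).
  tidy <- data.frame(id = 1:3,
                     share_int = c(5, 2, NA),
                     condition = c("label", "label", "control"))
  write.csv(tidy, "data/study1.csv", row.names = FALSE)
  # A short plain-text codebook documenting variables and missing codes.
  writeLines(c(
    "id:        participant number",
    "share_int: sharing intention, 1-7 scale; NA = not answered",
    "condition: 'label' = fact-check label shown; 'control' = no label"
  ), "doc/codebook.txt")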

Design Decision: If you can’t legally or ethically share raw data, share a synthetic or toy dataset plus your full analysis notebook so others can still follow your logic.
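
One simple way to build such a stand-in dataset in R; the structure mirrors the protected data, but every value is random.

  # Generate a synthetic dataset with the same columns as the real one.
  set.seed(123)
  synthetic <- data.frame(
    id        = 1:200,
    condition = sample(c("label", "control"), 200, replace = TRUE),
    share_int = round(runif(200, min = 1, max = 7))
  )
  write.csv(synthetic, "data/synthetic-study1.csv", row.names = FALSE)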


Micro-Blueprints You Can Reuse

One-Page Data Management Plan (DMP)

  • What: data types (survey responses, transcripts, platform metadata).
  • Where: storage locations (local encrypted drive, institution cloud).
  • Who: access roles (you, teammate, instructor).
  • How long: retention period and deletion plan.
  • Protection: identifiers, de-identification steps, consent terms.
  • Sharing: what will be shared (scripts, codebook, synthetic data) and where (GitHub/OSF repo).

Operationalization Snapshot

  • Construct: “Sharing intention”
  • Indicator: mean of three 7-point items (e.g., “I would share this post with friends”).
  • Timing/Context: immediately after exposure to a post.
  • Quality checks: attention check; reverse-coded item; internal consistency threshold (e.g., α ≥ .70). See the sketch after this list.
  • Decision rule: preregistered exclusion criteria (failed attention check; completion < 60s).
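
The snapshot translates almost line for line into analysis code. A sketch with simulated responses; the item-generating function and all numbers are invented for illustration.

  # Simulate three correlated 7-point items driven by one latent disposition.
  set.seed(11)
  n <- 120
  trait <- rnorm(n)
  make_item <- function(rev = FALSE) {
    x <- pmin(pmax(round(4 + 1.5 * trait + rnorm(n)), 1), 7)
    if (rev) 8 - x else x            # optionally simulate a reverse-keyed item
  }
  items <- data.frame(item1 = make_item(),
                      item2 = make_item(),
                      item3 = make_item(rev = TRUE))
  items$item3 <- 8 - items$item3     # reverse-code before averaging
  share_intention <- rowMeans(items) # the construct's indicator
  # Cronbach's alpha by hand: k/(k-1) * (1 - sum of item variances / total variance)
  k <- ncol(items)
  alpha <- (k / (k - 1)) * (1 - sum(sapply(items, var)) / var(rowSums(items)))
  alpha >= .70                       # internal-consistency threshold
  # Apply the preregistered exclusion criteria.
  passed <- sample(c(TRUE, FALSE), n, replace = TRUE, prob = c(.95, .05))
  secs   <- rnorm(n, mean = 300, sd = 80)
  sum(passed & secs >= 60)           # participants retained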

Tool-Agnostic Principles (Why Logic Beats Buttons)

Software evolves; principles endure. Whether you analyze with SPSS, R, or Python, you still need to:

  • Match design to question.
  • Match measurement to construct.
  • Match analysis to design and assumptions.
  • Report with transparency (what you planned, what you did, what changed, and why).

R Quick Win (2–5 minutes): Create a Quarto project with a reproducible scaffold.

  1. Create a new RStudio Project → “New Directory.”
  2. Add folders: data-raw/, data/, scripts/, figs/, doc/.
  3. In scripts/00-setup.R, add:
set.seed(451)   # fix the random seed so stochastic steps are repeatable
sessionInfo()   # record the R version and loaded packages for this run
  4. Render a blank index.qmd to confirm your environment is captured.

Conclusion: Research as Disciplined Curiosity

Research channels everyday curiosity into a transparent, cumulative process. The workflow gives you the map; the scientific approach gives you the compass. Layer in reproducibility and ethical reflexivity, and your work becomes not only credible today but also useful tomorrow—for you, your teammates, and anyone who wants to understand or extend what you’ve done.


Journal Prompts

  1. Think about a claim you’ve seen online that you weren’t sure was true. How would the principles of empiricism, control, and replication help you design a study to test whether it was accurate and build confidence in the result?

  2. Choose a topic you’re curious about in media or communication (e.g., the effect of streaming on movie watching, how politicians use TikTok). Frame it as an exploratory, explanatory, and evaluative research question. Pick one question and briefly describe what you would do in each of the five stages of the research workflow to answer it.

  3. Describe a time you learned a digital tool (in any context—school, work, a hobby) without really understanding the reasoning behind what you were doing. How might knowing the “why”—the tool-agnostic principles—have helped you use it more effectively, solve problems, or even choose a better tool for the task?


Quick Reference

  • Five Stages: Conceptualization → Design → Data Collection → Data Analysis → Communication
  • Five Principles: Empiricism • Objectivity • Determinism • Control • Replication
  • Three Goals: Exploratory • Explanatory • Evaluative
  • Always On: Ethics • Reproducibility • Transparency