• AI & NLP
  • Churn & Loyalty
  • Customer Experience
  • Customer Journeys
  • Customer Metrics
  • Feedback Analysis
  • Product Experience
  • Product Updates
  • Sentiment Analysis
  • Surveys & Feedback Collection
  • Try Thematic

Welcome to the community

what is data analysis procedures for qualitative research

Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic)

When we conduct qualitative methods of research, need to explain changes in metrics or understand people's opinions, we always turn to qualitative data. Qualitative data is typically generated through:

  • Interview transcripts
  • Surveys with open-ended questions
  • Contact center transcripts
  • Texts and documents
  • Audio and video recordings
  • Observational notes

Compared to quantitative data, which captures structured information, qualitative data is unstructured and has more depth. It can answer our questions, can help formulate hypotheses and build understanding.

It's important to understand the differences between quantitative data & qualitative data . But unfortunately, analyzing qualitative data is difficult. While tools like Excel, Tableau and PowerBI crunch and visualize quantitative data with ease, there are a limited number of mainstream tools for analyzing qualitative data . The majority of qualitative data analysis still happens manually.

That said, there are two new trends that are changing this. First, there are advances in natural language processing (NLP) which is focused on understanding human language. Second, there is an explosion of user-friendly software designed for both researchers and businesses. Both help automate the qualitative data analysis process.

In this post we want to teach you how to conduct a successful qualitative data analysis. There are two primary qualitative data analysis methods; manual & automatic. We will teach you how to conduct the analysis manually, and also, automatically using software solutions powered by NLP. We’ll guide you through the steps to conduct a manual analysis, and look at what is involved and the role technology can play in automating this process.

More businesses are switching to fully-automated analysis of qualitative customer data because it is cheaper, faster, and just as accurate. Primarily, businesses purchase subscriptions to feedback analytics platforms so that they can understand customer pain points and sentiment.

Overwhelming quantity of feedback

We’ll take you through 5 steps to conduct a successful qualitative data analysis. Within each step we will highlight the key difference between the manual, and automated approach of qualitative researchers. Here's an overview of the steps:

The 5 steps to doing qualitative data analysis

  • Gathering and collecting your qualitative data
  • Organizing and connecting into your qualitative data
  • Coding your qualitative data
  • Analyzing the qualitative data for insights
  • Reporting on the insights derived from your analysis

What is Qualitative Data Analysis?

Qualitative data analysis is a process of gathering, structuring and interpreting qualitative data to understand what it represents.

Qualitative data is non-numerical and unstructured. Qualitative data generally refers to text, such as open-ended responses to survey questions or user interviews, but also includes audio, photos and video.

Businesses often perform qualitative data analysis on customer feedback. And within this context, qualitative data generally refers to verbatim text data collected from sources such as reviews, complaints, chat messages, support centre interactions, customer interviews, case notes or social media comments.

How is qualitative data analysis different from quantitative data analysis?

Understanding the differences between quantitative & qualitative data is important. When it comes to analyzing data, Qualitative Data Analysis serves a very different role to Quantitative Data Analysis. But what sets them apart?

Qualitative Data Analysis dives into the stories hidden in non-numerical data such as interviews, open-ended survey answers, or notes from observations. It uncovers the ‘whys’ and ‘hows’ giving a deep understanding of people’s experiences and emotions.

Quantitative Data Analysis on the other hand deals with numerical data, using statistics to measure differences, identify preferred options, and pinpoint root causes of issues.  It steps back to address questions like "how many" or "what percentage" to offer broad insights we can apply to larger groups.

In short, Qualitative Data Analysis is like a microscope,  helping us understand specific detail. Quantitative Data Analysis is like the telescope, giving us a broader perspective. Both are important, working together to decode data for different objectives.

Qualitative Data Analysis methods

Once all the data has been captured, there are a variety of analysis techniques available and the choice is determined by your specific research objectives and the kind of data you’ve gathered.  Common qualitative data analysis methods include:

Content Analysis

This is a popular approach to qualitative data analysis. Other qualitative analysis techniques may fit within the broad scope of content analysis. Thematic analysis is a part of the content analysis.  Content analysis is used to identify the patterns that emerge from text, by grouping content into words, concepts, and themes. Content analysis is useful to quantify the relationship between all of the grouped content. The Columbia School of Public Health has a detailed breakdown of content analysis .

Narrative Analysis

Narrative analysis focuses on the stories people tell and the language they use to make sense of them.  It is particularly useful in qualitative research methods where customer stories are used to get a deep understanding of customers’ perspectives on a specific issue. A narrative analysis might enable us to summarize the outcomes of a focused case study.

Discourse Analysis

Discourse analysis is used to get a thorough understanding of the political, cultural and power dynamics that exist in specific situations.  The focus of discourse analysis here is on the way people express themselves in different social contexts. Discourse analysis is commonly used by brand strategists who hope to understand why a group of people feel the way they do about a brand or product.

Thematic Analysis

Thematic analysis is used to deduce the meaning behind the words people use. This is accomplished by discovering repeating themes in text. These meaningful themes reveal key insights into data and can be quantified, particularly when paired with sentiment analysis . Often, the outcome of thematic analysis is a code frame that captures themes in terms of codes, also called categories. So the process of thematic analysis is also referred to as “coding”. A common use-case for thematic analysis in companies is analysis of customer feedback.

Grounded Theory

Grounded theory is a useful approach when little is known about a subject. Grounded theory starts by formulating a theory around a single data case. This means that the theory is “grounded”. Grounded theory analysis is based on actual data, and not entirely speculative. Then additional cases can be examined to see if they are relevant and can add to the original grounded theory.

Methods of qualitative data analysis; approaches and techniques to qualitative data analysis

Challenges of Qualitative Data Analysis

While Qualitative Data Analysis offers rich insights, it comes with its challenges. Each unique QDA method has its unique hurdles. Let’s take a look at the challenges researchers and analysts might face, depending on the chosen method.

  • Time and Effort (Narrative Analysis): Narrative analysis, which focuses on personal stories, demands patience. Sifting through lengthy narratives to find meaningful insights can be time-consuming, requires dedicated effort.
  • Being Objective (Grounded Theory): Grounded theory, building theories from data, faces the challenges of personal biases. Staying objective while interpreting data is crucial, ensuring conclusions are rooted in the data itself.
  • Complexity (Thematic Analysis): Thematic analysis involves identifying themes within data, a process that can be intricate. Categorizing and understanding themes can be complex, especially when each piece of data varies in context and structure. Thematic Analysis software can simplify this process.
  • Generalizing Findings (Narrative Analysis): Narrative analysis, dealing with individual stories, makes drawing broad challenging. Extending findings from a single narrative to a broader context requires careful consideration.
  • Managing Data (Thematic Analysis): Thematic analysis involves organizing and managing vast amounts of unstructured data, like interview transcripts. Managing this can be a hefty task, requiring effective data management strategies.
  • Skill Level (Grounded Theory): Grounded theory demands specific skills to build theories from the ground up. Finding or training analysts with these skills poses a challenge, requiring investment in building expertise.

Benefits of qualitative data analysis

Qualitative Data Analysis (QDA) is like a versatile toolkit, offering a tailored approach to understanding your data. The benefits it offers are as diverse as the methods. Let’s explore why choosing the right method matters.

  • Tailored Methods for Specific Needs: QDA isn't one-size-fits-all. Depending on your research objectives and the type of data at hand, different methods offer unique benefits. If you want emotive customer stories, narrative analysis paints a strong picture. When you want to explain a score, thematic analysis reveals insightful patterns
  • Flexibility with Thematic Analysis: thematic analysis is like a chameleon in the toolkit of QDA. It adapts well to different types of data and research objectives, making it a top choice for any qualitative analysis.
  • Deeper Understanding, Better Products: QDA helps you dive into people's thoughts and feelings. This deep understanding helps you build products and services that truly matches what people want, ensuring satisfied customers
  • Finding the Unexpected: Qualitative data often reveals surprises that we miss in quantitative data. QDA offers us new ideas and perspectives, for insights we might otherwise miss.
  • Building Effective Strategies: Insights from QDA are like strategic guides. They help businesses in crafting plans that match people’s desires.
  • Creating Genuine Connections: Understanding people’s experiences lets businesses connect on a real level. This genuine connection helps build trust and loyalty, priceless for any business.

How to do Qualitative Data Analysis: 5 steps

Now we are going to show how you can do your own qualitative data analysis. We will guide you through this process step by step. As mentioned earlier, you will learn how to do qualitative data analysis manually , and also automatically using modern qualitative data and thematic analysis software.

To get best value from the analysis process and research process, it’s important to be super clear about the nature and scope of the question that’s being researched. This will help you select the research collection channels that are most likely to help you answer your question.

Depending on if you are a business looking to understand customer sentiment, or an academic surveying a school, your approach to qualitative data analysis will be unique.

Once you’re clear, there’s a sequence to follow. And, though there are differences in the manual and automatic approaches, the process steps are mostly the same.

The use case for our step-by-step guide is a company looking to collect data (customer feedback data), and analyze the customer feedback - in order to improve customer experience. By analyzing the customer feedback the company derives insights about their business and their customers. You can follow these same steps regardless of the nature of your research. Let’s get started.

Step 1: Gather your qualitative data and conduct research (Conduct qualitative research)

The first step of qualitative research is to do data collection. Put simply, data collection is gathering all of your data for analysis. A common situation is when qualitative data is spread across various sources.

Classic methods of gathering qualitative data

Most companies use traditional methods for gathering qualitative data: conducting interviews with research participants, running surveys, and running focus groups. This data is typically stored in documents, CRMs, databases and knowledge bases. It’s important to examine which data is available and needs to be included in your research project, based on its scope.

Using your existing qualitative feedback

As it becomes easier for customers to engage across a range of different channels, companies are gathering increasingly large amounts of both solicited and unsolicited qualitative feedback.

Most organizations have now invested in Voice of Customer programs , support ticketing systems, chatbot and support conversations, emails and even customer Slack chats.

These new channels provide companies with new ways of getting feedback, and also allow the collection of unstructured feedback data at scale.

The great thing about this data is that it contains a wealth of valubale insights and that it’s already there! When you have a new question about user behavior or your customers, you don’t need to create a new research study or set up a focus group. You can find most answers in the data you already have.

Typically, this data is stored in third-party solutions or a central database, but there are ways to export it or connect to a feedback analysis solution through integrations or an API.

Utilize untapped qualitative data channels

There are many online qualitative data sources you may not have considered. For example, you can find useful qualitative data in social media channels like Twitter or Facebook. Online forums, review sites, and online communities such as Discourse or Reddit also contain valuable data about your customers, or research questions.

If you are considering performing a qualitative benchmark analysis against competitors - the internet is your best friend. Gathering feedback in competitor reviews on sites like Trustpilot, G2, Capterra, Better Business Bureau or on app stores is a great way to perform a competitor benchmark analysis.

Customer feedback analysis software often has integrations into social media and review sites, or you could use a solution like DataMiner to scrape the reviews.

G2.com reviews of the product Airtable. You could pull reviews from G2 for your analysis.

Step 2: Connect & organize all your qualitative data

Now you all have this qualitative data but there’s a problem, the data is unstructured. Before feedback can be analyzed and assigned any value, it needs to be organized in a single place. Why is this important? Consistency!

If all data is easily accessible in one place and analyzed in a consistent manner, you will have an easier time summarizing and making decisions based on this data.

The manual approach to organizing your data

The classic method of structuring qualitative data is to plot all the raw data you’ve gathered into a spreadsheet.

Typically, research and support teams would share large Excel sheets and different business units would make sense of the qualitative feedback data on their own. Each team collects and organizes the data in a way that best suits them, which means the feedback tends to be kept in separate silos.

An alternative and a more robust solution is to store feedback in a central database, like Snowflake or Amazon Redshift .

Keep in mind that when you organize your data in this way, you are often preparing it to be imported into another software. If you go the route of a database, you would need to use an API to push the feedback into a third-party software.

Computer-assisted qualitative data analysis software (CAQDAS)

Traditionally within the manual analysis approach (but not always), qualitative data is imported into CAQDAS software for coding.

In the early 2000s, CAQDAS software was popularised by developers such as ATLAS.ti, NVivo and MAXQDA and eagerly adopted by researchers to assist with the organizing and coding of data.  

The benefits of using computer-assisted qualitative data analysis software:

  • Assists in the organizing of your data
  • Opens you up to exploring different interpretations of your data analysis
  • Allows you to share your dataset easier and allows group collaboration (allows for secondary analysis)

However you still need to code the data, uncover the themes and do the analysis yourself. Therefore it is still a manual approach.

The user interface of CAQDAS software 'NVivo'

Organizing your qualitative data in a feedback repository

Another solution to organizing your qualitative data is to upload it into a feedback repository where it can be unified with your other data , and easily searchable and taggable. There are a number of software solutions that act as a central repository for your qualitative research data. Here are a couple solutions that you could investigate:  

  • Dovetail: Dovetail is a research repository with a focus on video and audio transcriptions. You can tag your transcriptions within the platform for theme analysis. You can also upload your other qualitative data such as research reports, survey responses, support conversations, and customer interviews. Dovetail acts as a single, searchable repository. And makes it easier to collaborate with other people around your qualitative research.
  • EnjoyHQ: EnjoyHQ is another research repository with similar functionality to Dovetail. It boasts a more sophisticated search engine, but it has a higher starting subscription cost.

Organizing your qualitative data in a feedback analytics platform

If you have a lot of qualitative customer or employee feedback, from the likes of customer surveys or employee surveys, you will benefit from a feedback analytics platform. A feedback analytics platform is a software that automates the process of both sentiment analysis and thematic analysis . Companies use the integrations offered by these platforms to directly tap into their qualitative data sources (review sites, social media, survey responses, etc.). The data collected is then organized and analyzed consistently within the platform.

If you have data prepared in a spreadsheet, it can also be imported into feedback analytics platforms.

Once all this rich data has been organized within the feedback analytics platform, it is ready to be coded and themed, within the same platform. Thematic is a feedback analytics platform that offers one of the largest libraries of integrations with qualitative data sources.

Some of qualitative data integrations offered by Thematic

Step 3: Coding your qualitative data

Your feedback data is now organized in one place. Either within your spreadsheet, CAQDAS, feedback repository or within your feedback analytics platform. The next step is to code your feedback data so we can extract meaningful insights in the next step.

Coding is the process of labelling and organizing your data in such a way that you can then identify themes in the data, and the relationships between these themes.

To simplify the coding process, you will take small samples of your customer feedback data, come up with a set of codes, or categories capturing themes, and label each piece of feedback, systematically, for patterns and meaning. Then you will take a larger sample of data, revising and refining the codes for greater accuracy and consistency as you go.

If you choose to use a feedback analytics platform, much of this process will be automated and accomplished for you.

The terms to describe different categories of meaning (‘theme’, ‘code’, ‘tag’, ‘category’ etc) can be confusing as they are often used interchangeably.  For clarity, this article will use the term ‘code’.

To code means to identify key words or phrases and assign them to a category of meaning. “I really hate the customer service of this computer software company” would be coded as “poor customer service”.

How to manually code your qualitative data

  • Decide whether you will use deductive or inductive coding. Deductive coding is when you create a list of predefined codes, and then assign them to the qualitative data. Inductive coding is the opposite of this, you create codes based on the data itself. Codes arise directly from the data and you label them as you go. You need to weigh up the pros and cons of each coding method and select the most appropriate.
  • Read through the feedback data to get a broad sense of what it reveals. Now it’s time to start assigning your first set of codes to statements and sections of text.
  • Keep repeating step 2, adding new codes and revising the code description as often as necessary.  Once it has all been coded, go through everything again, to be sure there are no inconsistencies and that nothing has been overlooked.
  • Create a code frame to group your codes. The coding frame is the organizational structure of all your codes. And there are two commonly used types of coding frames, flat, or hierarchical. A hierarchical code frame will make it easier for you to derive insights from your analysis.
  • Based on the number of times a particular code occurs, you can now see the common themes in your feedback data. This is insightful! If ‘bad customer service’ is a common code, it’s time to take action.

We have a detailed guide dedicated to manually coding your qualitative data .

Example of a hierarchical coding frame in qualitative data analysis

Using software to speed up manual coding of qualitative data

An Excel spreadsheet is still a popular method for coding. But various software solutions can help speed up this process. Here are some examples.

  • CAQDAS / NVivo - CAQDAS software has built-in functionality that allows you to code text within their software. You may find the interface the software offers easier for managing codes than a spreadsheet.
  • Dovetail/EnjoyHQ - You can tag transcripts and other textual data within these solutions. As they are also repositories you may find it simpler to keep the coding in one platform.
  • IBM SPSS - SPSS is a statistical analysis software that may make coding easier than in a spreadsheet.
  • Ascribe - Ascribe’s ‘Coder’ is a coding management system. Its user interface will make it easier for you to manage your codes.

Automating the qualitative coding process using thematic analysis software

In solutions which speed up the manual coding process, you still have to come up with valid codes and often apply codes manually to pieces of feedback. But there are also solutions that automate both the discovery and the application of codes.

Advances in machine learning have now made it possible to read, code and structure qualitative data automatically. This type of automated coding is offered by thematic analysis software .

Automation makes it far simpler and faster to code the feedback and group it into themes. By incorporating natural language processing (NLP) into the software, the AI looks across sentences and phrases to identify common themes meaningful statements. Some automated solutions detect repeating patterns and assign codes to them, others make you train the AI by providing examples. You could say that the AI learns the meaning of the feedback on its own.

Thematic automates the coding of qualitative feedback regardless of source. There’s no need to set up themes or categories in advance. Simply upload your data and wait a few minutes. You can also manually edit the codes to further refine their accuracy.  Experiments conducted indicate that Thematic’s automated coding is just as accurate as manual coding .

Paired with sentiment analysis and advanced text analytics - these automated solutions become powerful for deriving quality business or research insights.

You could also build your own , if you have the resources!

The key benefits of using an automated coding solution

Automated analysis can often be set up fast and there’s the potential to uncover things that would never have been revealed if you had given the software a prescribed list of themes to look for.

Because the model applies a consistent rule to the data, it captures phrases or statements that a human eye might have missed.

Complete and consistent analysis of customer feedback enables more meaningful findings. Leading us into step 4.

Step 4: Analyze your data: Find meaningful insights

Now we are going to analyze our data to find insights. This is where we start to answer our research questions. Keep in mind that step 4 and step 5 (tell the story) have some overlap . This is because creating visualizations is both part of analysis process and reporting.

The task of uncovering insights is to scour through the codes that emerge from the data and draw meaningful correlations from them. It is also about making sure each insight is distinct and has enough data to support it.

Part of the analysis is to establish how much each code relates to different demographics and customer profiles, and identify whether there’s any relationship between these data points.

Manually create sub-codes to improve the quality of insights

If your code frame only has one level, you may find that your codes are too broad to be able to extract meaningful insights. This is where it is valuable to create sub-codes to your primary codes. This process is sometimes referred to as meta coding.

Note: If you take an inductive coding approach, you can create sub-codes as you are reading through your feedback data and coding it.

While time-consuming, this exercise will improve the quality of your analysis. Here is an example of what sub-codes could look like.

Example of sub-codes

You need to carefully read your qualitative data to create quality sub-codes. But as you can see, the depth of analysis is greatly improved. By calculating the frequency of these sub-codes you can get insight into which  customer service problems you can immediately address.

Correlate the frequency of codes to customer segments

Many businesses use customer segmentation . And you may have your own respondent segments that you can apply to your qualitative analysis. Segmentation is the practise of dividing customers or research respondents into subgroups.

Segments can be based on:

  • Demographic
  • And any other data type that you care to segment by

It is particularly useful to see the occurrence of codes within your segments. If one of your customer segments is considered unimportant to your business, but they are the cause of nearly all customer service complaints, it may be in your best interest to focus attention elsewhere. This is a useful insight!

Manually visualizing coded qualitative data

There are formulas you can use to visualize key insights in your data. The formulas we will suggest are imperative if you are measuring a score alongside your feedback.

If you are collecting a metric alongside your qualitative data this is a key visualization. Impact answers the question: “What’s the impact of a code on my overall score?”. Using Net Promoter Score (NPS) as an example, first you need to:

  • Calculate overall NPS
  • Calculate NPS in the subset of responses that do not contain that theme
  • Subtract B from A

Then you can use this simple formula to calculate code impact on NPS .

Visualizing qualitative data: Calculating the impact of a code on your score

You can then visualize this data using a bar chart.

You can download our CX toolkit - it includes a template to recreate this.

Trends over time

This analysis can help you answer questions like: “Which codes are linked to decreases or increases in my score over time?”

We need to compare two sequences of numbers: NPS over time and code frequency over time . Using Excel, calculate the correlation between the two sequences, which can be either positive (the more codes the higher the NPS, see picture below), or negative (the more codes the lower the NPS).

Now you need to plot code frequency against the absolute value of code correlation with NPS. Here is the formula:

Analyzing qualitative data: Calculate which codes are linked to increases or decreases in my score

The visualization could look like this:

Visualizing qualitative data trends over time

These are two examples, but there are more. For a third manual formula, and to learn why word clouds are not an insightful form of analysis, read our visualizations article .

Using a text analytics solution to automate analysis

Automated text analytics solutions enable codes and sub-codes to be pulled out of the data automatically. This makes it far faster and easier to identify what’s driving negative or positive results. And to pick up emerging trends and find all manner of rich insights in the data.

Another benefit of AI-driven text analytics software is its built-in capability for sentiment analysis, which provides the emotive context behind your feedback and other qualitative textual data therein.

Thematic provides text analytics that goes further by allowing users to apply their expertise on business context to edit or augment the AI-generated outputs.

Since the move away from manual research is generally about reducing the human element, adding human input to the technology might sound counter-intuitive. However, this is mostly to make sure important business nuances in the feedback aren’t missed during coding. The result is a higher accuracy of analysis. This is sometimes referred to as augmented intelligence .

Codes displayed by volume within Thematic. You can 'manage themes' to introduce human input.

Step 5: Report on your data: Tell the story

The last step of analyzing your qualitative data is to report on it, to tell the story. At this point, the codes are fully developed and the focus is on communicating the narrative to the audience.

A coherent outline of the qualitative research, the findings and the insights is vital for stakeholders to discuss and debate before they can devise a meaningful course of action.

Creating graphs and reporting in Powerpoint

Typically, qualitative researchers take the tried and tested approach of distilling their report into a series of charts, tables and other visuals which are woven into a narrative for presentation in Powerpoint.

Using visualization software for reporting

With data transformation and APIs, the analyzed data can be shared with data visualisation software, such as Power BI or Tableau , Google Studio or Looker. Power BI and Tableau are among the most preferred options.

Visualizing your insights inside a feedback analytics platform

Feedback analytics platforms, like Thematic, incorporate visualisation tools that intuitively turn key data and insights into graphs.  This removes the time consuming work of constructing charts to visually identify patterns and creates more time to focus on building a compelling narrative that highlights the insights, in bite-size chunks, for executive teams to review.

Using a feedback analytics platform with visualization tools means you don’t have to use a separate product for visualizations. You can export graphs into Powerpoints straight from the platforms.

Two examples of qualitative data visualizations within Thematic

Conclusion - Manual or Automated?

There are those who remain deeply invested in the manual approach - because it’s familiar, because they’re reluctant to spend money and time learning new software, or because they’ve been burned by the overpromises of AI.  

For projects that involve small datasets, manual analysis makes sense. For example, if the objective is simply to quantify a simple question like “Do customers prefer X concepts to Y?”. If the findings are being extracted from a small set of focus groups and interviews, sometimes it’s easier to just read them

However, as new generations come into the workplace, it’s technology-driven solutions that feel more comfortable and practical. And the merits are undeniable.  Especially if the objective is to go deeper and understand the ‘why’ behind customers’ preference for X or Y. And even more especially if time and money are considerations.

The ability to collect a free flow of qualitative feedback data at the same time as the metric means AI can cost-effectively scan, crunch, score and analyze a ton of feedback from one system in one go. And time-intensive processes like focus groups, or coding, that used to take weeks, can now be completed in a matter of hours or days.

But aside from the ever-present business case to speed things up and keep costs down, there are also powerful research imperatives for automated analysis of qualitative data: namely, accuracy and consistency.

Finding insights hidden in feedback requires consistency, especially in coding.  Not to mention catching all the ‘unknown unknowns’ that can skew research findings and steering clear of cognitive bias.

Some say without manual data analysis researchers won’t get an accurate “feel” for the insights. However, the larger data sets are, the harder it is to sort through the feedback and organize feedback that has been pulled from different places.  And, the more difficult it is to stay on course, the greater the risk of drawing incorrect, or incomplete, conclusions grows.

Though the process steps for qualitative data analysis have remained pretty much unchanged since psychologist Paul Felix Lazarsfeld paved the path a hundred years ago, the impact digital technology has had on types of qualitative feedback data and the approach to the analysis are profound.  

If you want to try an automated feedback analysis solution on your own qualitative data, you can get started with Thematic .

what is data analysis procedures for qualitative research

Community & Marketing

Tyler manages our community of CX, insights & analytics professionals. Tyler's goal is to help unite insights professionals around common challenges.

We make it easy to discover the customer and product issues that matter.

Unlock the value of feedback at scale, in one platform. Try it for free now!

  • Questions to ask your Feedback Analytics vendor
  • How to end customer churn for good
  • Scalable analysis of NPS verbatims
  • 5 Text analytics approaches
  • How to calculate the ROI of CX

Our experts will show you how Thematic works, how to discover pain points and track the ROI of decisions. To access your free trial, book a personal demo today.

Recent posts

Watercare is New Zealand's largest water and wastewater service provider. They are responsible for bringing clean water to 1.7 million people in Tamaki Makaurau (Auckland) and safeguarding the wastewater network to minimize impact on the environment. Water is a sector that often gets taken for granted, with drainage and

Become a qualitative theming pro! Creating a perfect code frame is hard, but thematic analysis software makes the process much easier.

Qualtrics is one of the most well-known and powerful Customer Feedback Management platforms. But even so, it has limitations. We recently hosted a live panel where data analysts from two well-known brands shared their experiences with Qualtrics, and how they extended this platform’s capabilities. Below, we’ll share the

Grad Coach

Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods , one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.

Qualitative data analysis methods

What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers” . In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes , yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics . Qualitative research investigates the “softer side” of things to explore and describe , while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here .

qualitative data analysis vs quantitative data analysis

So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.

Need a helping hand?

what is data analysis procedures for qualitative research

In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course includes links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses . We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes , summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.

Naturally, while content analysis is widely useful, it’s not without its drawbacks . One of the main issues with content analysis is that it can be very time-consuming , as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations , so don’t be put off by these – just be aware of them ! If you’re interested in learning more about content analysis, the video below provides a good starting point.

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means . Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives . Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses , too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions . If you’re keen to learn more about narrative analysis, the video below provides a great introduction to this qualitative analysis method.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate . So, discourse analysis is all about analysing language within its social context. In other words, analysing language – such as a conversation, a speech, etc – within the culture and society it takes place. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture , history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast . Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might land up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming  as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method. Again, if you’re keen to learn more, the video below presents a good starting point.

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes . These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences , views, and opinions . Therefore, if your research aims and objectives involve understanding people’s experience or view of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop , or even change as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.

Thematic analysis takes bodies of data and groups them according to similarities (themes), which help us make sense of the content.

QDA Method #5: Grounded theory (GT) 

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “ tests ” and “ revisions ”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using Grounded theory , you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to read a post about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop . As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up .

Grounded theory is used to create a new theory (or theories) by using the data at hand, as opposed to existing theories and frameworks.

QDA Method #6:   Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation . This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject – centred . In other words, it’s focused on the experiencer . This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias . While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.

IPA can help you understand the personal experiences of a person or group concerning a major life event, an experience or a situation.

How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “ How do I choose the right one? ”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions . In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people that have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims are distinctly different , and therefore different analysis methods would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant. 

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect . So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation ). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims , objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.

No single analysis method is perfect, so it can often make sense to adopt more than one  method (this is called triangulation).

Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis , a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis , which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we went south with grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.

If you’re still feeling a bit confused, consider our private coaching service , where we hold your hand through the research process to help you develop your best work.

what is data analysis procedures for qualitative research

Psst... there’s more!

This post was based on one of our popular Research Bootcamps . If you're working on a research project, you'll definitely want to check this out ...

You Might Also Like:

Sampling methods and strategies in research

84 Comments

Richard N

This has been very helpful. Thank you.

netaji

Thank you madam,

Mariam Jaiyeola

Thank you so much for this information

Nzube

I wonder it so clear for understand and good for me. can I ask additional query?

Lee

Very insightful and useful

Susan Nakaweesi

Good work done with clear explanations. Thank you.

Titilayo

Thanks so much for the write-up, it’s really good.

Hemantha Gunasekara

Thanks madam . It is very important .

Gumathandra

thank you very good

Pramod Bahulekar

This has been very well explained in simple language . It is useful even for a new researcher.

Derek Jansen

Great to hear that. Good luck with your qualitative data analysis, Pramod!

Adam Zahir

This is very useful information. And it was very a clear language structured presentation. Thanks a lot.

Golit,F.

Thank you so much.

Emmanuel

very informative sequential presentation

Shahzada

Precise explanation of method.

Alyssa

Hi, may we use 2 data analysis methods in our qualitative research?

Thanks for your comment. Most commonly, one would use one type of analysis method, but it depends on your research aims and objectives.

Dr. Manju Pandey

You explained it in very simple language, everyone can understand it. Thanks so much.

Phillip

Thank you very much, this is very helpful. It has been explained in a very simple manner that even a layman understands

Anne

Thank nicely explained can I ask is Qualitative content analysis the same as thematic analysis?

Thanks for your comment. No, QCA and thematic are two different types of analysis. This article might help clarify – https://onlinelibrary.wiley.com/doi/10.1111/nhs.12048

Rev. Osadare K . J

This is my first time to come across a well explained data analysis. so helpful.

Tina King

I have thoroughly enjoyed your explanation of the six qualitative analysis methods. This is very helpful. Thank you!

Bromie

Thank you very much, this is well explained and useful

udayangani

i need a citation of your book.

khutsafalo

Thanks a lot , remarkable indeed, enlighting to the best

jas

Hi Derek, What other theories/methods would you recommend when the data is a whole speech?

M

Keep writing useful artikel.

Adane

It is important concept about QDA and also the way to express is easily understandable, so thanks for all.

Carl Benecke

Thank you, this is well explained and very useful.

Ngwisa

Very helpful .Thanks.

Hajra Aman

Hi there! Very well explained. Simple but very useful style of writing. Please provide the citation of the text. warm regards

Hillary Mophethe

The session was very helpful and insightful. Thank you

This was very helpful and insightful. Easy to read and understand

Catherine

As a professional academic writer, this has been so informative and educative. Keep up the good work Grad Coach you are unmatched with quality content for sure.

Keep up the good work Grad Coach you are unmatched with quality content for sure.

Abdulkerim

Its Great and help me the most. A Million Thanks you Dr.

Emanuela

It is a very nice work

Noble Naade

Very insightful. Please, which of this approach could be used for a research that one is trying to elicit students’ misconceptions in a particular concept ?

Karen

This is Amazing and well explained, thanks

amirhossein

great overview

Tebogo

What do we call a research data analysis method that one use to advise or determining the best accounting tool or techniques that should be adopted in a company.

Catherine Shimechero

Informative video, explained in a clear and simple way. Kudos

Van Hmung

Waoo! I have chosen method wrong for my data analysis. But I can revise my work according to this guide. Thank you so much for this helpful lecture.

BRIAN ONYANGO MWAGA

This has been very helpful. It gave me a good view of my research objectives and how to choose the best method. Thematic analysis it is.

Livhuwani Reineth

Very helpful indeed. Thanku so much for the insight.

Storm Erlank

This was incredibly helpful.

Jack Kanas

Very helpful.

catherine

very educative

Wan Roslina

Nicely written especially for novice academic researchers like me! Thank you.

Talash

choosing a right method for a paper is always a hard job for a student, this is a useful information, but it would be more useful personally for me, if the author provide me with a little bit more information about the data analysis techniques in type of explanatory research. Can we use qualitative content analysis technique for explanatory research ? or what is the suitable data analysis method for explanatory research in social studies?

ramesh

that was very helpful for me. because these details are so important to my research. thank you very much

Kumsa Desisa

I learnt a lot. Thank you

Tesfa NT

Relevant and Informative, thanks !

norma

Well-planned and organized, thanks much! 🙂

Dr. Jacob Lubuva

I have reviewed qualitative data analysis in a simplest way possible. The content will highly be useful for developing my book on qualitative data analysis methods. Cheers!

Nyi Nyi Lwin

Clear explanation on qualitative and how about Case study

Ogobuchi Otuu

This was helpful. Thank you

Alicia

This was really of great assistance, it was just the right information needed. Explanation very clear and follow.

Wow, Thanks for making my life easy

C. U

This was helpful thanks .

Dr. Alina Atif

Very helpful…. clear and written in an easily understandable manner. Thank you.

Herb

This was so helpful as it was easy to understand. I’m a new to research thank you so much.

cissy

so educative…. but Ijust want to know which method is coding of the qualitative or tallying done?

Ayo

Thank you for the great content, I have learnt a lot. So helpful

Tesfaye

precise and clear presentation with simple language and thank you for that.

nneheng

very informative content, thank you.

Oscar Kuebutornye

You guys are amazing on YouTube on this platform. Your teachings are great, educative, and informative. kudos!

NG

Brilliant Delivery. You made a complex subject seem so easy. Well done.

Ankit Kumar

Beautifully explained.

Thanks a lot

Kidada Owen-Browne

Is there a video the captures the practical process of coding using automated applications?

Thanks for the comment. We don’t recommend using automated applications for coding, as they are not sufficiently accurate in our experience.

Mathewos Damtew

content analysis can be qualitative research?

Hend

THANK YOU VERY MUCH.

Dev get

Thank you very much for such a wonderful content

Kassahun Aman

do you have any material on Data collection

Prince .S. mpofu

What a powerful explanation of the QDA methods. Thank you.

Kassahun

Great explanation both written and Video. i have been using of it on a day to day working of my thesis project in accounting and finance. Thank you very much for your support.

BORA SAMWELI MATUTULI

very helpful, thank you so much

Submit a Comment Cancel reply

Your email address will not be published. Required fields are marked *

Save my name, email, and website in this browser for the next time I comment.

  • Print Friendly

what is data analysis procedures for qualitative research

The Ultimate Guide to Qualitative Research - Part 2: Handling Qualitative Data

what is data analysis procedures for qualitative research

  • Handling qualitative data
  • Transcripts
  • Field notes
  • Survey data and responses
  • Visual and audio data
  • Data organization
  • Data coding
  • Coding frame
  • Auto and smart coding
  • Organizing codes
  • Introduction

What is qualitative data analysis?

Qualitative data analysis methods, how do you analyze qualitative data, content analysis, thematic analysis.

  • Thematic analysis vs. content analysis
  • Narrative research

Phenomenological research

Discourse analysis, grounded theory.

  • Deductive reasoning
  • Inductive reasoning
  • Inductive vs. deductive reasoning
  • Qualitative data interpretation
  • Qualitative analysis software

Qualitative data analysis

Analyzing qualitative data is the next step after you have completed the use of qualitative data collection methods . The qualitative analysis process aims to identify themes and patterns that emerge across the data.

what is data analysis procedures for qualitative research

In simplified terms, qualitative research methods involve non-numerical data collection followed by an explanation based on the attributes of the data . For example, if you are asked to explain in qualitative terms a thermal image displayed in multiple colors, then you would explain the color differences rather than the heat's numerical value. If you have a large amount of data (e.g., of group discussions or observations of real-life situations), the next step is to transcribe and prepare the raw data for subsequent analysis.

Researchers can conduct studies fully based on qualitative methodology, or researchers can preface a quantitative research study with a qualitative study to identify issues that were not originally envisioned but are important to the study. Quantitative researchers may also collect and analyze qualitative data following their quantitative analyses to better understand the meanings behind their statistical results.

Conducting qualitative research can especially help build an understanding of how and why certain outcomes were achieved (in addition to what was achieved). For example, qualitative data analysis is often used for policy and program evaluation research since it can answer certain important questions more efficiently and effectively than quantitative approaches.

what is data analysis procedures for qualitative research

Qualitative data analysis can also answer important questions about the relevance, unintended effects, and impact of programs, such as:

  • Were expectations reasonable?
  • Did processes operate as expected?
  • Were key players able to carry out their duties?
  • Were there any unintended effects of the program?

The importance of qualitative data analysis

Qualitative approaches have the advantage of allowing for more diversity in responses and the capacity to adapt to new developments or issues during the research process itself. While qualitative analysis of data can be demanding and time-consuming to conduct, many fields of research utilize qualitative software tools that have been specifically developed to provide more succinct, cost-efficient, and timely results.

what is data analysis procedures for qualitative research

Qualitative data analysis is an important part of research and building greater understanding across fields for a number of reasons. First, cases for qualitative data analysis can be selected purposefully according to whether they typify certain characteristics or contextual locations. In other words, qualitative data permits deep immersion into a topic, phenomenon, or area of interest. Rather than seeking generalizability to the population the sample of participants represent, qualitative research aims to construct an in-depth and nuanced understanding of the research topic.

Secondly, the role or position of the researcher in qualitative analysis of data is given greater critical attention. This is because, in qualitative data analysis, the possibility of the researcher taking a ‘neutral' or transcendent position is seen as more problematic in practical and/or philosophical terms. Hence, qualitative researchers are often exhorted to reflect on their role in the research process and make this clear in the analysis.

what is data analysis procedures for qualitative research

Thirdly, while qualitative data analysis can take a wide variety of forms, it largely differs from quantitative research in the focus on language, signs, experiences, and meaning. In addition, qualitative approaches to analysis are often holistic and contextual rather than analyzing the data in a piecemeal fashion or removing the data from its context. Qualitative approaches thus allow researchers to explore inquiries from directions that could not be accessed with only numerical quantitative data.

Establishing research rigor

Systematic and transparent approaches to the analysis of qualitative data are essential for rigor . For example, many qualitative research methods require researchers to carefully code data and discern and document themes in a consistent and credible way.

what is data analysis procedures for qualitative research

Perhaps the most traditional division in the way qualitative and quantitative research have been used in the social sciences is for qualitative methods to be used for exploratory purposes (e.g., to generate new theory or propositions) or to explain puzzling quantitative results, while quantitative methods are used to test hypotheses .

what is data analysis procedures for qualitative research

After you’ve collected relevant data , what is the best way to look at your data ? As always, it will depend on your research question . For instance, if you employed an observational research method to learn about a group’s shared practices, an ethnographic approach could be appropriate to explain the various dimensions of culture. If you collected textual data to understand how people talk about something, then a discourse analysis approach might help you generate key insights about language and communication.

what is data analysis procedures for qualitative research

The qualitative data coding process involves iterative categorization and recategorization, ensuring the evolution of the analysis to best represent the data. The procedure typically concludes with the interpretation of patterns and trends identified through the coding process.

To start off, let’s look at two broad approaches to data analysis.

Deductive analysis

Deductive analysis is guided by pre-existing theories or ideas. It starts with a theoretical framework , which is then used to code the data. The researcher can thus use this theoretical framework to interpret their data and answer their research question .

The key steps include coding the data based on the predetermined concepts or categories and using the theory to guide the interpretation of patterns among the codings. Deductive analysis is particularly useful when researchers aim to verify or extend an existing theory within a new context.

Inductive analysis

Inductive analysis involves the generation of new theories or ideas based on the data. The process starts without any preconceived theories or codes, and patterns, themes, and categories emerge out of the data.

what is data analysis procedures for qualitative research

The researcher codes the data to capture any concepts or patterns that seem interesting or important to the research question . These codes are then compared and linked, leading to the formation of broader categories or themes. The main goal of inductive analysis is to allow the data to 'speak for itself' rather than imposing pre-existing expectations or ideas onto the data.

Deductive and inductive approaches can be seen as sitting on opposite poles, and all research falls somewhere within that spectrum. Most often, qualitative analysis approaches blend both deductive and inductive elements to contribute to the existing conversation around a topic while remaining open to potential unexpected findings. To help you make informed decisions about which qualitative data analysis approach fits with your research objectives, let's look at some of the common approaches for qualitative data analysis.

Content analysis is a research method used to identify patterns and themes within qualitative data. This approach involves systematically coding and categorizing specific aspects of the content in the data to uncover trends and patterns. An often important part of content analysis is quantifying frequencies and patterns of words or characteristics present in the data .

It is a highly flexible technique that can be adapted to various data types , including text, images, and audiovisual content . While content analysis can be exploratory in nature, it is also common to use pre-established theories and follow a more deductive approach to categorizing and quantifying the qualitative data.

what is data analysis procedures for qualitative research

Thematic analysis is a method used to identify, analyze, and report patterns or themes within the data. This approach moves beyond counting explicit words or phrases and focuses on also identifying implicit concepts and themes within the data.

what is data analysis procedures for qualitative research

Researchers conduct detailed coding of the data to ascertain repeated themes or patterns of meaning. Codes can be categorized into themes, and the researcher can analyze how the themes relate to one another. Thematic analysis is flexible in terms of the research framework, allowing for both inductive (data-driven) and deductive (theory-driven) approaches. The outcome is a rich, detailed, and complex account of the data.

Grounded theory is a systematic qualitative research methodology that is used to inductively generate theory that is 'grounded' in the data itself. Analysis takes place simultaneously with data collection , and researchers iterate between data collection and analysis until a comprehensive theory is developed.

Grounded theory is characterized by simultaneous data collection and analysis, the development of theoretical codes from the data, purposeful sampling of participants, and the constant comparison of data with emerging categories and concepts. The ultimate goal is to create a theoretical explanation that fits the data and answers the research question .

Discourse analysis is a qualitative research approach that emphasizes the role of language in social contexts. It involves examining communication and language use beyond the level of the sentence, considering larger units of language such as texts or conversations.

what is data analysis procedures for qualitative research

Discourse analysts typically investigate how social meanings and understandings are constructed in different contexts, emphasizing the connection between language and power. It can be applied to texts of all kinds, including interviews , documents, case studies , and social media posts.

Phenomenological research focuses on exploring how human beings make sense of an experience and delves into the essence of this experience. It strives to understand people's perceptions, perspectives, and understandings of a particular situation or phenomenon.

what is data analysis procedures for qualitative research

It involves in-depth engagement with participants, often through interviews or conversations, to explore their lived experiences. The goal is to derive detailed descriptions of the essence of the experience and to interpret what insights or implications this may bear on our understanding of this phenomenon.

what is data analysis procedures for qualitative research

Whatever your data analysis approach, start with ATLAS.ti

Qualitative data analysis done quickly and intuitively with ATLAS.ti. Download a free trial today.

Now that we've summarized the major approaches to data analysis, let's look at the broader process of research and data analysis. Suppose you need to do some research to find answers to any kind of research question, be it an academic inquiry, business problem, or policy decision. In that case, you need to collect some data. There are many methods of collecting data: you can collect primary data yourself by conducting interviews, focus groups , or a survey , for instance. Another option is to use secondary data sources. These are data previously collected for other projects, historical records, reports, statistics – basically everything that exists already and can be relevant to your research.

what is data analysis procedures for qualitative research

The data you collect should always be a good fit for your research question . For example, if you are interested in how many people in your target population like your brand compared to others, it is no use to conduct interviews or a few focus groups . The sample will be too small to get a representative picture of the population. If your questions are about "how many….", "what is the spread…" etc., you need to conduct quantitative research . If you are interested in why people like different brands, their motives, and their experiences, then conducting qualitative research can provide you with the answers you are looking for.

Let's describe the important steps involved in conducting research.

Step 1: Planning the research

As the saying goes: "Garbage in, garbage out." Suppose you find out after you have collected data that

  • you talked to the wrong people
  • asked the wrong questions
  • a couple of focus groups sessions would have yielded better results because of the group interaction, or
  • a survey including a few open-ended questions sent to a larger group of people would have been sufficient and required less effort.

Think thoroughly about sampling, the questions you will be asking, and in which form. If you conduct a focus group or an interview, you are the research instrument, and your data collection will only be as good as you are. If you have never done it before, seek some training and practice. If you have other people do it, make sure they have the skills.

what is data analysis procedures for qualitative research

Step 2: Preparing the data

When you conduct focus groups or interviews, think about how to transcribe them. Do you want to run them online or offline? If online, check out which tools can serve your needs, both in terms of functionality and cost. For any audio or video recordings , you can consider using automatic transcription software or services. Automatically generated transcripts can save you time and money, but they still need to be checked. If you don't do this yourself, make sure that you instruct the person doing it on how to prepare the data.

  • How should the final transcript be formatted for later analysis?
  • Which names and locations should be anonymized?
  • What kind of speaker IDs to use?

What about survey data ? Some survey data programs will immediately provide basic descriptive-level analysis of the responses. ATLAS.ti will support you with the analysis of the open-ended questions. For this, you need to export your data as an Excel file. ATLAS.ti's survey import wizard will guide you through the process.

Other kinds of data such as images, videos, audio recordings, text, and more can be imported to ATLAS.ti. You can organize all your data into groups and write comments on each source of data to maintain a systematic organization and documentation of your data.

what is data analysis procedures for qualitative research

Step 3: Exploratory data analysis

You can run a few simple exploratory analyses to get to know your data. For instance, you can create a word list or word cloud of all your text data or compare and contrast the words in different documents. You can also let ATLAS.ti find relevant concepts for you. There are many tools available that can automatically code your text data, so you can also use these codings to explore your data and refine your coding.

what is data analysis procedures for qualitative research

For instance, you can get a feeling for the sentiments expressed in the data. Who is more optimistic, pessimistic, or neutral in their responses? ATLAS.ti can auto-code the positive, negative, and neutral sentiments in your data. Naturally, you can also simply browse through your data and highlight relevant segments that catch your attention or attach codes to begin condensing the data.

what is data analysis procedures for qualitative research

Step 4: Build a code system

Whether you start with auto-coding or manual coding, after having generated some first codes, you need to get some order in your code system to develop a cohesive understanding. You can build your code system by sorting codes into groups and creating categories and subcodes. As this process requires reading and re-reading your data, you will become very familiar with your data. Counting on a tool like ATLAS.ti qualitative data analysis software will support you in the process and make it easier to review your data, modify codings if necessary, change code labels, and write operational definitions to explain what each code means.

what is data analysis procedures for qualitative research

Step 5: Query your coded data and write up the analysis

Once you have coded your data, it is time to take the analysis a step further. When using software for qualitative data analysis , it is easy to compare and contrast subsets in your data, such as groups of participants or sets of themes.

what is data analysis procedures for qualitative research

For instance, you can query the various opinions of female vs. male respondents. Is there a difference between consumers from rural or urban areas or among different age groups or educational levels? Which codes occur together throughout the data set? Are there relationships between various concepts, and if so, why?

Step 6: Data visualization

Data visualization brings your data to life. It is a powerful way of seeing patterns and relationships in your data. For instance, diagrams allow you to see how your codes are distributed across documents or specific subpopulations in your data.

what is data analysis procedures for qualitative research

Exploring coded data on a canvas, moving around code labels in a virtual space, linking codes and other elements of your data set, and thinking about how they are related and why – all of these will advance your analysis and spur further insights. Visuals are also great for communicating results to others.

Step 7: Data presentation

The final step is to summarize the analysis in a written report . You can now put together the memos you have written about the various topics, select some salient quotes that illustrate your writing, and add visuals such as tables and diagrams. If you follow the steps above, you will already have all the building blocks, and you just have to put them together in a report or presentation.

When preparing a report or a presentation, keep your audience in mind. Does your audience better understand numbers than long sections of detailed interpretations? If so, add more tables, charts, and short supportive data quotes to your report or presentation. If your audience loves a good interpretation, add your full-length memos and walk your audience through your conceptual networks and illustrative data quotes.

what is data analysis procedures for qualitative research

Qualitative data analysis begins with ATLAS.ti

For tools that can make the most out of your data, check out ATLAS.ti with a free trial.

Analyst Answers

Data & Finance for Work & Life

man doing qualitative research

Data Analysis for Qualitative Research: 6 Step Guide

Data analysis for qualitative research is not intuitive. This is because qualitative data stands in opposition to traditional data analysis methodologies: while data analysis is concerned with quantities, qualitative data is by definition unquantified . But there is an easy, methodical approach that anyone can take use to get reliable results when performing data analysis for qualitative research. The process consists of 6 steps that I’ll break down in this article:

  • Perform interviews(if necessary )
  • Gather all documents and transcribe any non-paper records
  • Decide whether to either code analytical data, analyze word frequencies, or both
  • Decide what interpretive angle you want to take: content analysis , narrative analysis, discourse analysis, framework analysis, and/or grounded theory
  • Compile your data in a spreadsheet using document saving techniques (windows and mac)
  • Identify trends in words, themes, metaphors, natural patterns, and more

To complete these steps, you will need:

  • Microsoft word
  • Microsoft excel
  • Internet access

You can get the free Intro to Data Analysis eBook to cover the fundamentals and ensure strong progression in all your data endeavors.

What is qualitative research?

Qualitative research is not the same as quantitative research. In short, qualitative research is the interpretation of non-numeric data. It usually aims at drawing conclusions that explain why a phenomenon occurs, rather than that one does occur. Here’s a great quote from a nursing magazine about quantitative vs qualitative research:

“A traditional quantitative study… uses a predetermined (and auditable) set of steps to confirm or refute [a] hypothesis. “In contrast, qualitative research often takes the position that an interpretive understanding is only possible by way of uncovering or deconstructing the meanings of a phenomenon. Thus, a distinction between explaining how something operates (explanation) and why it operates in the manner that it does (interpretation) may be [an] effective way to distinguish quantitative from qualitative analytic processes involved in any particular study.” (bold added) (( EBN ))

Learn to Interpret Your Qualitative Data

This article explain what data analysis is and how to do it. To learn how to interpret the results, visualize, and write an insightful report, sign up for our handbook below.

what is data analysis procedures for qualitative research

Step 1a: Data collection methods and techniques in qualitative research: interviews and focus groups

Step 1 is collecting the data that you will need for the analysis. If you are not performing any interviews or focus groups to gather data, then you can skip this step. It’s for people who need to go into the field and collect raw information as part of their qualitative analysis.

Since the whole point of an interview and of qualitative analysis in general is to understand a research question better, you should start by making sure you have a specific, refined research question . Whether you’re a researcher by trade or a data analyst working on one-time project, you must know specifically what you want to understand in order to get results.

Good research questions are specific enough to guide action but open enough to leave room for insight and growth. Examples of good research questions include:

  • Good : To what degree does living in a city impact the quality of a person’s life? (open-ended, complex)
  • Bad : Does living in a city impact the quality of a person’s life? (closed, simple)

Once you understand the research question, you need to develop a list of interview questions. These questions should likewise be open-ended and provide liberty of expression to the responder. They should support the research question in an active way without prejudicing the response. Examples of good interview questions include:

  • Good : Tell me what it’s like to live in a city versus in the country. (open, not leading)
  • Bad : Don’t you prefer the city to the country because there are more people? (closed, leading)

Some additional helpful tips include:

  • Begin each interview with a neutral question to get the person relaxed
  • Limit each question to a single idea
  • If you don’t understand, ask for clarity
  • Do not pass any judgements
  • Do not spend more than 15m on an interview, lest the quality of responses drop

Focus groups

The alternative to interviews is focus groups. Focus groups are a great way for you to get an idea for how people communicate their opinions in a group setting, rather than a one-on-one setting as in interviews.

In short, focus groups are gatherings of small groups of people from representative backgrounds who receive instruction, or “facilitation,” from a focus group leader. Typically, the leader will ask questions to stimulate conversation, reformulate questions to bring the discussion back to focus, and prevent the discussion from turning sour or giving way to bad faith.

Focus group questions should be open-ended like their interview neighbors, and they should stimulate some degree of disagreement. Disagreement often leads to valuable information about differing opinions, as people tend to say what they mean if contradicted.

However, focus group leaders must be careful not to let disagreements escalate, as anger can make people lie to be hurtful or simply to win an argument. And lies are not helpful in data analysis for qualitative research.

Step 1b: Tools for qualitative data collection

When it comes to data analysis for qualitative analysis, the tools you use to collect data should align to some degree with the tools you will use to analyze the data.

As mentioned in the intro, you will be focusing on analysis techniques that only require the traditional Microsoft suite programs: Microsoft Excel and Microsoft Word . At the same time, you can source supplementary tools from various websites, like Text Analyzer and WordCounter.

In short, the tools for qualitative data collection that you need are Excel and Word , as well as web-based free tools like Text Analyzer and WordCounter . These online tools are helpful in the quantitative part of your qualitative research.

Step 2: Gather all documents & transcribe non-written docs

Once you have your interviews and/or focus group transcripts, it’s time to decide if you need other documentation. If you do, you’ll need to gather it all into one place first, then develop a strategy for how to transcribe any non-written documents.

When do you need documentation other than interviews and focus groups? Two situations usually call for documentation. First , if you have little funding , then you can’t afford to run expensive interviews and focus groups.

Second , social science researchers typically focus on documents since their research questions are less concerned with subject-oriented data, while hard science and business researchers typically focus on interviews and focus groups because they want to know what people think, and they want to know today.

Non-written records

Other factors at play include the type of research, the field, and specific research goal. For those who need documentation and to describe non-written records, there are some steps to follow:

  • Put all hard copy source documents into a sealed binder (I use plastic paper holders with elastic seals ).
  • If you are sourcing directly from printed books or journals, then you will need to digitalize them by scanning them and making them text readable by the computer. To do so, turn all PDFs into Word documents using online tools such as PDF to Word Converter . This process is never full-proof, and it may be a source of error in the data collection, but it’s part of the process.
  • If you are sourcing online documents, try as often as possible to get computer-readable PDF documents that you can easily copy/paste or convert. Locked PDFs are essentially a lost cause .
  • Transcribe any audio files into written documents. There are free online tools available to help with this, such as 360converter . If you run a test through the system, you’ll see that the output is not 100%. The best way to use this tool is as a first draft generator. You can then correct and complete it with old fashioned, direct transcription.

Step 3: Decide on the type of qualitative research

Before step 3 you should have collected your data, transcribed it all into written-word documents, and compiled it in one place. Now comes the interesting part. You need to decide what you want to get out of your research by choosing an analytic angle, or type of qualitative research.

The available types of qualitative research are as follows. Each of them takes a unique angle that you must choose to get what information you want from the analysis . In addition, each of them has a different impact on the data analysis for qualitative research (coding vs word frequency) that we use.

Content analysis

Narrative analysis, discourse analysis.

  • Framework analysis, and/or

Grounded theory

From a high level, content, narrative, and discourse analysis are actionable independent tactics, whereas framework analysis and grounded theory are ways of honing and applying the first three.

  • Definition : Content analysis is identify and labelling themes of any kind within a text.
  • Focus : Identifying any kind of pattern in written text, transcribed audio, or transcribed video. This could be thematic, word repetition, idea repetition. Most often, the patterns we find are idea that make up an argument.
  • Goal : To simplify, standardize, and quickly reference ideas from any given text. Content analysis is a way to pull the main ideas from huge documents for comparison. In this way, it’s more a means to an end.
  • Pros : The huge advantage of doing content analysis is that you can quickly process huge amounts of texts using simple coding and word frequency techniques we will look at below. To use a metaphore, it is to qualitative analysis documents what Spark notes are to books.
  • Cons : The downside to content analysis is that it’s quite general. If you have a very specific, narrative research question, then tracing “any and all ideas” will not be very helpful to you.
  • Definition : Narrative analysis is the reformulation and simplification of interview answers or documentation into small narrative components to identify story-like patterns.
  • Focus : Understanding the text based on its narrative components as opposed to themes or other qualities.
  • Goal : To reference the text from an angle closer to the nature of texts in order to obtain further insights.
  • Pros : Narrative analysis is very useful for getting perspective on a topic in which you’re extremely limited. It can be easy to get tunnel vision when you’re digging for themes and ideas from a reason-centric perspective. Turning to a narrative approach will help you stay grounded. More importantly, it helps reveal different kinds of trends.
  • Cons : Narrative analysis adds another layer of subjectivity to the instinctive nature of qualitative research. Many see it as too dependent on the researcher to hold any critical value.
  • Definition : Discourse analysis is the textual analysis of naturally occurring speech. Any oral expression must be transcribed before undergoing legitimate discourse analysis.
  • Focus : Understanding ideas and themes through language communicated orally rather than pre-processed on paper.
  • Goal : To obtain insights from an angle outside the traditional content analysis on text.
  • Pros : Provides a considerable advantage in some areas of study in order to understand how people communicate an idea, versus the idea itself. For example, discourse analysis is important in political campaigning. People rarely vote for the candidate who most closely corresponds to his/her beliefs, but rather for the person they like the most.
  • Cons : As with narrative analysis, discourse analysis is more subjective in nature than content analysis, which focuses on ideas and patterns. Some do not consider it rigorous enough to be considered a legitimate subset of qualitative analysis, but these people are few.

Framework analysis

  • Definition : Framework analysis is a kind of qualitative analysis that includes 5 ordered steps: coding, indexing, charting, mapping, and interpreting . In most ways, framework analysis is a synonym for qualitative analysis — the same thing. The significant difference is the importance it places on the perspective used in the analysis.
  • Focus : Understanding patterns in themes and ideas.
  • Goal : Creating one specific framework for looking at a text.
  • Pros : Framework analysis is helpful when the researcher clearly understands what he/she wants from the project, as it’s a limitation approach. Since each of its step has defined parameters, framework analysis is very useful for teamwork.
  • Cons : It can lead to tunnel vision.
  • Definition : The use of content, narrative, and discourse analysis to examine a single case, in the hopes that discoveries from that case will lead to a foundational theory used to examine other like cases.
  • Focus : A vast approach using multiple techniques in order to establish patterns.
  • Goal : To develop a foundational theory.
  • Pros : When successful, grounded theories can revolutionize entire fields of study.
  • Cons : It’s very difficult to establish ground theories, and there’s an enormous amount of risk involved.

Step 4: Coding, word frequency, or both

Coding in data analysis for qualitative research is the process of writing 2-5 word codes that summarize at least 1 paragraphs of text (not writing computer code). This allows researchers to keep track of and analyze those codes. On the other hand, word frequency is the process of counting the presence and orientation of words within a text, which makes it the quantitative element in qualitative data analysis.

Video example of coding for data analysis in qualitative research

In short, coding in the context of data analysis for qualitative research follows 2 steps (video below):

  • Reading through the text one time
  • Adding 2-5 word summaries each time a significant theme or idea appears

Let’s look at a brief example of how to code for qualitative research in this video:

Click here for a link to the source text. 1

Example of word frequency processing

And word frequency is the process of finding a specific word or identifying the most common words through 3 steps:

  • Decide if you want to find 1 word or identify the most common ones
  • Use word’s “Replace” function to find a word or phrase
  • Use Text Analyzer to find the most common terms

Here’s another look at word frequency processing and how you to do it. Let’s look at the same example above, but from a quantitative perspective.

Imagine we are already familiar with melanoma and KITs , and we want to analyze the text based on these keywords. One thing we can do is look for these words using the Replace function in word

  • Locate the search bar
  • Click replace
  • Type in the word
  • See the total results

Here’s a brief video example:

Another option is to use an online Text Analyzer. This methodology won’t help us find a specific word, but it will help us discover the top performing phrases and words. All you need to do it put in a link to a target page or paste a text. I pasted the abstract from our source text, and what turns up is as expected. Here’s a picture:

text analyzer example

Step 5: Compile your data in a spreadsheet

After you have some coded data in the word document, you need to get it into excel for analysis. This process requires saving the word doc as an .htm extension, which makes it a website. Once you have the website, it’s as simple as opening that page, scrolling to the bottom, and copying/pasting the comments, or codes, into an excel document.

You will need to wrangle the data slightly in order to make it readable in excel. I’ve made a video to explain this process and places it below.

Step 6: Identify trends & analyze!

There are literally thousands of different ways to analyze qualitative data, and in most situations, the best technique depends on the information you want to get out of the research.

Nevertheless, there are a few go-to techniques. The most important of this is occurrences . In this short video, we finish the example from above by counting the number of times our codes appear. In this way, it’s very similar to word frequency (discussed above).

A few other options include:

  • Ranking each code on a set of relevant criteria and clustering
  • Pure cluster analysis
  • Causal analysis

We cover different types of analysis like this on the website, so be sure to check out other articles on the home page .

How to analyze qualitative data from an interview

To analyze qualitative data from an interview , follow the same 6 steps for quantitative data analysis:

  • Perform the interviews
  • Transcribe the interviews onto paper
  • Decide whether to either code analytical data (open, axial, selective), analyze word frequencies, or both
  • Compile your data in a spreadsheet using document saving techniques (for windows and mac)
  • Source text [ ↩ ]

About the Author

Noah is the founder & Editor-in-Chief at AnalystAnswers. He is a transatlantic professional and entrepreneur with 5+ years of corporate finance and data analytics experience, as well as 3+ years in consumer financial products and business software. He started AnalystAnswers to provide aspiring professionals with accessible explanations of otherwise dense finance and data concepts. Noah believes everyone can benefit from an analytical mindset in growing digital world. When he's not busy at work, Noah likes to explore new European cities, exercise, and spend time with friends and family.

File available immediately.

what is data analysis procedures for qualitative research

Notice: JavaScript is required for this content.

what is data analysis procedures for qualitative research

Research-Methodology

Qualitative Data Analysis

Qualitative data refers to non-numeric information such as interview transcripts, notes, video and audio recordings, images and text documents. Qualitative data analysis can be divided into the following five categories:

1. Content analysis . This refers to the process of categorizing verbal or behavioural data to classify, summarize and tabulate the data.

2. Narrative analysis . This method involves the reformulation of stories presented by respondents taking into account context of each case and different experiences of each respondent. In other words, narrative analysis is the revision of primary qualitative data by researcher.

3. Discourse analysis . A method of analysis of naturally occurring talk and all types of written text.

4. Framework analysis . This is more advanced method that consists of several stages such as familiarization, identifying a thematic framework, coding, charting, mapping and interpretation.

5. Grounded theory . This method of qualitative data analysis starts with an analysis of a single case to formulate a theory. Then, additional cases are examined to see if they contribute to the theory.

Qualitative data analysis can be conducted through the following three steps:

Step 1: Developing and Applying Codes . Coding can be explained as categorization of data. A ‘code’ can be a word or a short phrase that represents a theme or an idea. All codes need to be assigned meaningful titles. A wide range of non-quantifiable elements such as events, behaviours, activities, meanings etc. can be coded.

There are three types of coding:

  • Open coding . The initial organization of raw data to try to make sense of it.
  • Axial coding . Interconnecting and linking the categories of codes.
  • Selective coding . Formulating the story through connecting the categories.

Coding can be done manually or using qualitative data analysis software such as

 NVivo,  Atlas ti 6.0,  HyperRESEARCH 2.8,  Max QDA and others.

When using manual coding you can use folders, filing cabinets, wallets etc. to gather together materials that are examples of similar themes or analytic ideas. Manual method of coding in qualitative data analysis is rightly considered as labour-intensive, time-consuming and outdated.

In computer-based coding, on the other hand, physical files and cabinets are replaced with computer based directories and files. When choosing software for qualitative data analysis you need to consider a wide range of factors such as the type and amount of data you need to analyse, time required to master the software and cost considerations.

Moreover, it is important to get confirmation from your dissertation supervisor prior to application of any specific qualitative data analysis software.

The following table contains examples of research titles, elements to be coded and identification of relevant codes:

 Qualitative data coding

Step 2: Identifying themes, patterns and relationships . Unlike quantitative methods , in qualitative data analysis there are no universally applicable techniques that can be applied to generate findings. Analytical and critical thinking skills of researcher plays significant role in data analysis in qualitative studies. Therefore, no qualitative study can be repeated to generate the same results.

Nevertheless, there is a set of techniques that you can use to identify common themes, patterns and relationships within responses of sample group members in relation to codes that have been specified in the previous stage.

Specifically, the most popular and effective methods of qualitative data interpretation include the following:

  • Word and phrase repetitions – scanning primary data for words and phrases most commonly used by respondents, as well as, words and phrases used with unusual emotions;
  • Primary and secondary data comparisons – comparing the findings of interview/focus group/observation/any other qualitative data collection method with the findings of literature review and discussing differences between them;
  • Search for missing information – discussions about which aspects of the issue was not mentioned by respondents, although you expected them to be mentioned;
  • Metaphors and analogues – comparing primary research findings to phenomena from a different area and discussing similarities and differences.

Step 3: Summarizing the data . At this last stage you need to link research findings to hypotheses or research aim and objectives. When writing data analysis chapter, you can use noteworthy quotations from the transcript in order to highlight major themes within findings and possible contradictions.

It is important to note that the process of qualitative data analysis described above is general and different types of qualitative studies may require slightly different methods of data analysis.

My  e-book,  The Ultimate Guide to Writing a Dissertation in Business Studies: a step by step approach  contains a detailed, yet simple explanation of qualitative data analysis methods . The e-book explains all stages of the research process starting from the selection of the research area to writing personal reflection. Important elements of dissertations such as research philosophy, research approach, research design, methods of data collection and data analysis are explained in simple words. John Dudovskiy

Qualitative Data Analysis

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

  • Advanced Search
  • Journal List
  • Can J Hosp Pharm
  • v.68(3); May-Jun 2015

Logo of cjhp

Qualitative Research: Data Collection, Analysis, and Management

Introduction.

In an earlier paper, 1 we presented an introduction to using qualitative research methods in pharmacy practice. In this article, we review some principles of the collection, analysis, and management of qualitative data to help pharmacists interested in doing research in their practice to continue their learning in this area. Qualitative research can help researchers to access the thoughts and feelings of research participants, which can enable development of an understanding of the meaning that people ascribe to their experiences. Whereas quantitative research methods can be used to determine how many people undertake particular behaviours, qualitative methods can help researchers to understand how and why such behaviours take place. Within the context of pharmacy practice research, qualitative approaches have been used to examine a diverse array of topics, including the perceptions of key stakeholders regarding prescribing by pharmacists and the postgraduation employment experiences of young pharmacists (see “Further Reading” section at the end of this article).

In the previous paper, 1 we outlined 3 commonly used methodologies: ethnography 2 , grounded theory 3 , and phenomenology. 4 Briefly, ethnography involves researchers using direct observation to study participants in their “real life” environment, sometimes over extended periods. Grounded theory and its later modified versions (e.g., Strauss and Corbin 5 ) use face-to-face interviews and interactions such as focus groups to explore a particular research phenomenon and may help in clarifying a less-well-understood problem, situation, or context. Phenomenology shares some features with grounded theory (such as an exploration of participants’ behaviour) and uses similar techniques to collect data, but it focuses on understanding how human beings experience their world. It gives researchers the opportunity to put themselves in another person’s shoes and to understand the subjective experiences of participants. 6 Some researchers use qualitative methodologies but adopt a different standpoint, and an example of this appears in the work of Thurston and others, 7 discussed later in this paper.

Qualitative work requires reflection on the part of researchers, both before and during the research process, as a way of providing context and understanding for readers. When being reflexive, researchers should not try to simply ignore or avoid their own biases (as this would likely be impossible); instead, reflexivity requires researchers to reflect upon and clearly articulate their position and subjectivities (world view, perspectives, biases), so that readers can better understand the filters through which questions were asked, data were gathered and analyzed, and findings were reported. From this perspective, bias and subjectivity are not inherently negative but they are unavoidable; as a result, it is best that they be articulated up-front in a manner that is clear and coherent for readers.

THE PARTICIPANT’S VIEWPOINT

What qualitative study seeks to convey is why people have thoughts and feelings that might affect the way they behave. Such study may occur in any number of contexts, but here, we focus on pharmacy practice and the way people behave with regard to medicines use (e.g., to understand patients’ reasons for nonadherence with medication therapy or to explore physicians’ resistance to pharmacists’ clinical suggestions). As we suggested in our earlier article, 1 an important point about qualitative research is that there is no attempt to generalize the findings to a wider population. Qualitative research is used to gain insights into people’s feelings and thoughts, which may provide the basis for a future stand-alone qualitative study or may help researchers to map out survey instruments for use in a quantitative study. It is also possible to use different types of research in the same study, an approach known as “mixed methods” research, and further reading on this topic may be found at the end of this paper.

The role of the researcher in qualitative research is to attempt to access the thoughts and feelings of study participants. This is not an easy task, as it involves asking people to talk about things that may be very personal to them. Sometimes the experiences being explored are fresh in the participant’s mind, whereas on other occasions reliving past experiences may be difficult. However the data are being collected, a primary responsibility of the researcher is to safeguard participants and their data. Mechanisms for such safeguarding must be clearly articulated to participants and must be approved by a relevant research ethics review board before the research begins. Researchers and practitioners new to qualitative research should seek advice from an experienced qualitative researcher before embarking on their project.

DATA COLLECTION

Whatever philosophical standpoint the researcher is taking and whatever the data collection method (e.g., focus group, one-to-one interviews), the process will involve the generation of large amounts of data. In addition to the variety of study methodologies available, there are also different ways of making a record of what is said and done during an interview or focus group, such as taking handwritten notes or video-recording. If the researcher is audio- or video-recording data collection, then the recordings must be transcribed verbatim before data analysis can begin. As a rough guide, it can take an experienced researcher/transcriber 8 hours to transcribe one 45-minute audio-recorded interview, a process than will generate 20–30 pages of written dialogue.

Many researchers will also maintain a folder of “field notes” to complement audio-taped interviews. Field notes allow the researcher to maintain and comment upon impressions, environmental contexts, behaviours, and nonverbal cues that may not be adequately captured through the audio-recording; they are typically handwritten in a small notebook at the same time the interview takes place. Field notes can provide important context to the interpretation of audio-taped data and can help remind the researcher of situational factors that may be important during data analysis. Such notes need not be formal, but they should be maintained and secured in a similar manner to audio tapes and transcripts, as they contain sensitive information and are relevant to the research. For more information about collecting qualitative data, please see the “Further Reading” section at the end of this paper.

DATA ANALYSIS AND MANAGEMENT

If, as suggested earlier, doing qualitative research is about putting oneself in another person’s shoes and seeing the world from that person’s perspective, the most important part of data analysis and management is to be true to the participants. It is their voices that the researcher is trying to hear, so that they can be interpreted and reported on for others to read and learn from. To illustrate this point, consider the anonymized transcript excerpt presented in Appendix 1 , which is taken from a research interview conducted by one of the authors (J.S.). We refer to this excerpt throughout the remainder of this paper to illustrate how data can be managed, analyzed, and presented.

Interpretation of Data

Interpretation of the data will depend on the theoretical standpoint taken by researchers. For example, the title of the research report by Thurston and others, 7 “Discordant indigenous and provider frames explain challenges in improving access to arthritis care: a qualitative study using constructivist grounded theory,” indicates at least 2 theoretical standpoints. The first is the culture of the indigenous population of Canada and the place of this population in society, and the second is the social constructivist theory used in the constructivist grounded theory method. With regard to the first standpoint, it can be surmised that, to have decided to conduct the research, the researchers must have felt that there was anecdotal evidence of differences in access to arthritis care for patients from indigenous and non-indigenous backgrounds. With regard to the second standpoint, it can be surmised that the researchers used social constructivist theory because it assumes that behaviour is socially constructed; in other words, people do things because of the expectations of those in their personal world or in the wider society in which they live. (Please see the “Further Reading” section for resources providing more information about social constructivist theory and reflexivity.) Thus, these 2 standpoints (and there may have been others relevant to the research of Thurston and others 7 ) will have affected the way in which these researchers interpreted the experiences of the indigenous population participants and those providing their care. Another standpoint is feminist standpoint theory which, among other things, focuses on marginalized groups in society. Such theories are helpful to researchers, as they enable us to think about things from a different perspective. Being aware of the standpoints you are taking in your own research is one of the foundations of qualitative work. Without such awareness, it is easy to slip into interpreting other people’s narratives from your own viewpoint, rather than that of the participants.

To analyze the example in Appendix 1 , we will adopt a phenomenological approach because we want to understand how the participant experienced the illness and we want to try to see the experience from that person’s perspective. It is important for the researcher to reflect upon and articulate his or her starting point for such analysis; for example, in the example, the coder could reflect upon her own experience as a female of a majority ethnocultural group who has lived within middle class and upper middle class settings. This personal history therefore forms the filter through which the data will be examined. This filter does not diminish the quality or significance of the analysis, since every researcher has his or her own filters; however, by explicitly stating and acknowledging what these filters are, the researcher makes it easer for readers to contextualize the work.

Transcribing and Checking

For the purposes of this paper it is assumed that interviews or focus groups have been audio-recorded. As mentioned above, transcribing is an arduous process, even for the most experienced transcribers, but it must be done to convert the spoken word to the written word to facilitate analysis. For anyone new to conducting qualitative research, it is beneficial to transcribe at least one interview and one focus group. It is only by doing this that researchers realize how difficult the task is, and this realization affects their expectations when asking others to transcribe. If the research project has sufficient funding, then a professional transcriber can be hired to do the work. If this is the case, then it is a good idea to sit down with the transcriber, if possible, and talk through the research and what the participants were talking about. This background knowledge for the transcriber is especially important in research in which people are using jargon or medical terms (as in pharmacy practice). Involving your transcriber in this way makes the work both easier and more rewarding, as he or she will feel part of the team. Transcription editing software is also available, but it is expensive. For example, ELAN (more formally known as EUDICO Linguistic Annotator, developed at the Technical University of Berlin) 8 is a tool that can help keep data organized by linking media and data files (particularly valuable if, for example, video-taping of interviews is complemented by transcriptions). It can also be helpful in searching complex data sets. Products such as ELAN do not actually automatically transcribe interviews or complete analyses, and they do require some time and effort to learn; nonetheless, for some research applications, it may be a valuable to consider such software tools.

All audio recordings should be transcribed verbatim, regardless of how intelligible the transcript may be when it is read back. Lines of text should be numbered. Once the transcription is complete, the researcher should read it while listening to the recording and do the following: correct any spelling or other errors; anonymize the transcript so that the participant cannot be identified from anything that is said (e.g., names, places, significant events); insert notations for pauses, laughter, looks of discomfort; insert any punctuation, such as commas and full stops (periods) (see Appendix 1 for examples of inserted punctuation), and include any other contextual information that might have affected the participant (e.g., temperature or comfort of the room).

Dealing with the transcription of a focus group is slightly more difficult, as multiple voices are involved. One way of transcribing such data is to “tag” each voice (e.g., Voice A, Voice B). In addition, the focus group will usually have 2 facilitators, whose respective roles will help in making sense of the data. While one facilitator guides participants through the topic, the other can make notes about context and group dynamics. More information about group dynamics and focus groups can be found in resources listed in the “Further Reading” section.

Reading between the Lines

During the process outlined above, the researcher can begin to get a feel for the participant’s experience of the phenomenon in question and can start to think about things that could be pursued in subsequent interviews or focus groups (if appropriate). In this way, one participant’s narrative informs the next, and the researcher can continue to interview until nothing new is being heard or, as it says in the text books, “saturation is reached”. While continuing with the processes of coding and theming (described in the next 2 sections), it is important to consider not just what the person is saying but also what they are not saying. For example, is a lengthy pause an indication that the participant is finding the subject difficult, or is the person simply deciding what to say? The aim of the whole process from data collection to presentation is to tell the participants’ stories using exemplars from their own narratives, thus grounding the research findings in the participants’ lived experiences.

Smith 9 suggested a qualitative research method known as interpretative phenomenological analysis, which has 2 basic tenets: first, that it is rooted in phenomenology, attempting to understand the meaning that individuals ascribe to their lived experiences, and second, that the researcher must attempt to interpret this meaning in the context of the research. That the researcher has some knowledge and expertise in the subject of the research means that he or she can have considerable scope in interpreting the participant’s experiences. Larkin and others 10 discussed the importance of not just providing a description of what participants say. Rather, interpretative phenomenological analysis is about getting underneath what a person is saying to try to truly understand the world from his or her perspective.

Once all of the research interviews have been transcribed and checked, it is time to begin coding. Field notes compiled during an interview can be a useful complementary source of information to facilitate this process, as the gap in time between an interview, transcribing, and coding can result in memory bias regarding nonverbal or environmental context issues that may affect interpretation of data.

Coding refers to the identification of topics, issues, similarities, and differences that are revealed through the participants’ narratives and interpreted by the researcher. This process enables the researcher to begin to understand the world from each participant’s perspective. Coding can be done by hand on a hard copy of the transcript, by making notes in the margin or by highlighting and naming sections of text. More commonly, researchers use qualitative research software (e.g., NVivo, QSR International Pty Ltd; www.qsrinternational.com/products_nvivo.aspx ) to help manage their transcriptions. It is advised that researchers undertake a formal course in the use of such software or seek supervision from a researcher experienced in these tools.

Returning to Appendix 1 and reading from lines 8–11, a code for this section might be “diagnosis of mental health condition”, but this would just be a description of what the participant is talking about at that point. If we read a little more deeply, we can ask ourselves how the participant might have come to feel that the doctor assumed he or she was aware of the diagnosis or indeed that they had only just been told the diagnosis. There are a number of pauses in the narrative that might suggest the participant is finding it difficult to recall that experience. Later in the text, the participant says “nobody asked me any questions about my life” (line 19). This could be coded simply as “health care professionals’ consultation skills”, but that would not reflect how the participant must have felt never to be asked anything about his or her personal life, about the participant as a human being. At the end of this excerpt, the participant just trails off, recalling that no-one showed any interest, which makes for very moving reading. For practitioners in pharmacy, it might also be pertinent to explore the participant’s experience of akathisia and why this was left untreated for 20 years.

One of the questions that arises about qualitative research relates to the reliability of the interpretation and representation of the participants’ narratives. There are no statistical tests that can be used to check reliability and validity as there are in quantitative research. However, work by Lincoln and Guba 11 suggests that there are other ways to “establish confidence in the ‘truth’ of the findings” (p. 218). They call this confidence “trustworthiness” and suggest that there are 4 criteria of trustworthiness: credibility (confidence in the “truth” of the findings), transferability (showing that the findings have applicability in other contexts), dependability (showing that the findings are consistent and could be repeated), and confirmability (the extent to which the findings of a study are shaped by the respondents and not researcher bias, motivation, or interest).

One way of establishing the “credibility” of the coding is to ask another researcher to code the same transcript and then to discuss any similarities and differences in the 2 resulting sets of codes. This simple act can result in revisions to the codes and can help to clarify and confirm the research findings.

Theming refers to the drawing together of codes from one or more transcripts to present the findings of qualitative research in a coherent and meaningful way. For example, there may be examples across participants’ narratives of the way in which they were treated in hospital, such as “not being listened to” or “lack of interest in personal experiences” (see Appendix 1 ). These may be drawn together as a theme running through the narratives that could be named “the patient’s experience of hospital care”. The importance of going through this process is that at its conclusion, it will be possible to present the data from the interviews using quotations from the individual transcripts to illustrate the source of the researchers’ interpretations. Thus, when the findings are organized for presentation, each theme can become the heading of a section in the report or presentation. Underneath each theme will be the codes, examples from the transcripts, and the researcher’s own interpretation of what the themes mean. Implications for real life (e.g., the treatment of people with chronic mental health problems) should also be given.

DATA SYNTHESIS

In this final section of this paper, we describe some ways of drawing together or “synthesizing” research findings to represent, as faithfully as possible, the meaning that participants ascribe to their life experiences. This synthesis is the aim of the final stage of qualitative research. For most readers, the synthesis of data presented by the researcher is of crucial significance—this is usually where “the story” of the participants can be distilled, summarized, and told in a manner that is both respectful to those participants and meaningful to readers. There are a number of ways in which researchers can synthesize and present their findings, but any conclusions drawn by the researchers must be supported by direct quotations from the participants. In this way, it is made clear to the reader that the themes under discussion have emerged from the participants’ interviews and not the mind of the researcher. The work of Latif and others 12 gives an example of how qualitative research findings might be presented.

Planning and Writing the Report

As has been suggested above, if researchers code and theme their material appropriately, they will naturally find the headings for sections of their report. Qualitative researchers tend to report “findings” rather than “results”, as the latter term typically implies that the data have come from a quantitative source. The final presentation of the research will usually be in the form of a report or a paper and so should follow accepted academic guidelines. In particular, the article should begin with an introduction, including a literature review and rationale for the research. There should be a section on the chosen methodology and a brief discussion about why qualitative methodology was most appropriate for the study question and why one particular methodology (e.g., interpretative phenomenological analysis rather than grounded theory) was selected to guide the research. The method itself should then be described, including ethics approval, choice of participants, mode of recruitment, and method of data collection (e.g., semistructured interviews or focus groups), followed by the research findings, which will be the main body of the report or paper. The findings should be written as if a story is being told; as such, it is not necessary to have a lengthy discussion section at the end. This is because much of the discussion will take place around the participants’ quotes, such that all that is needed to close the report or paper is a summary, limitations of the research, and the implications that the research has for practice. As stated earlier, it is not the intention of qualitative research to allow the findings to be generalized, and therefore this is not, in itself, a limitation.

Planning out the way that findings are to be presented is helpful. It is useful to insert the headings of the sections (the themes) and then make a note of the codes that exemplify the thoughts and feelings of your participants. It is generally advisable to put in the quotations that you want to use for each theme, using each quotation only once. After all this is done, the telling of the story can begin as you give your voice to the experiences of the participants, writing around their quotations. Do not be afraid to draw assumptions from the participants’ narratives, as this is necessary to give an in-depth account of the phenomena in question. Discuss these assumptions, drawing on your participants’ words to support you as you move from one code to another and from one theme to the next. Finally, as appropriate, it is possible to include examples from literature or policy documents that add support for your findings. As an exercise, you may wish to code and theme the sample excerpt in Appendix 1 and tell the participant’s story in your own way. Further reading about “doing” qualitative research can be found at the end of this paper.

CONCLUSIONS

Qualitative research can help researchers to access the thoughts and feelings of research participants, which can enable development of an understanding of the meaning that people ascribe to their experiences. It can be used in pharmacy practice research to explore how patients feel about their health and their treatment. Qualitative research has been used by pharmacists to explore a variety of questions and problems (see the “Further Reading” section for examples). An understanding of these issues can help pharmacists and other health care professionals to tailor health care to match the individual needs of patients and to develop a concordant relationship. Doing qualitative research is not easy and may require a complete rethink of how research is conducted, particularly for researchers who are more familiar with quantitative approaches. There are many ways of conducting qualitative research, and this paper has covered some of the practical issues regarding data collection, analysis, and management. Further reading around the subject will be essential to truly understand this method of accessing peoples’ thoughts and feelings to enable researchers to tell participants’ stories.

Appendix 1. Excerpt from a sample transcript

The participant (age late 50s) had suffered from a chronic mental health illness for 30 years. The participant had become a “revolving door patient,” someone who is frequently in and out of hospital. As the participant talked about past experiences, the researcher asked:

  • What was treatment like 30 years ago?
  • Umm—well it was pretty much they could do what they wanted with you because I was put into the er, the er kind of system er, I was just on
  • endless section threes.
  • Really…
  • But what I didn’t realize until later was that if you haven’t actually posed a threat to someone or yourself they can’t really do that but I didn’t know
  • that. So wh-when I first went into hospital they put me on the forensic ward ’cause they said, “We don’t think you’ll stay here we think you’ll just
  • run-run away.” So they put me then onto the acute admissions ward and – er – I can remember one of the first things I recall when I got onto that
  • ward was sitting down with a er a Dr XXX. He had a book this thick [gestures] and on each page it was like three questions and he went through
  • all these questions and I answered all these questions. So we’re there for I don’t maybe two hours doing all that and he asked me he said “well
  • when did somebody tell you then that you have schizophrenia” I said “well nobody’s told me that” so he seemed very surprised but nobody had
  • actually [pause] whe-when I first went up there under police escort erm the senior kind of consultants people I’d been to where I was staying and
  • ermm so er [pause] I . . . the, I can remember the very first night that I was there and given this injection in this muscle here [gestures] and just
  • having dreadful side effects the next day I woke up [pause]
  • . . . and I suffered that akathesia I swear to you, every minute of every day for about 20 years.
  • Oh how awful.
  • And that side of it just makes life impossible so the care on the wards [pause] umm I don’t know it’s kind of, it’s kind of hard to put into words
  • [pause]. Because I’m not saying they were sort of like not friendly or interested but then nobody ever seemed to want to talk about your life [pause]
  • nobody asked me any questions about my life. The only questions that came into was they asked me if I’d be a volunteer for these student exams
  • and things and I said “yeah” so all the questions were like “oh what jobs have you done,” er about your relationships and things and er but
  • nobody actually sat down and had a talk and showed some interest in you as a person you were just there basically [pause] um labelled and you
  • know there was there was [pause] but umm [pause] yeah . . .

This article is the 10th in the CJHP Research Primer Series, an initiative of the CJHP Editorial Board and the CSHP Research Committee. The planned 2-year series is intended to appeal to relatively inexperienced researchers, with the goal of building research capacity among practising pharmacists. The articles, presenting simple but rigorous guidance to encourage and support novice researchers, are being solicited from authors with appropriate expertise.

Previous articles in this series:

Bond CM. The research jigsaw: how to get started. Can J Hosp Pharm . 2014;67(1):28–30.

Tully MP. Research: articulating questions, generating hypotheses, and choosing study designs. Can J Hosp Pharm . 2014;67(1):31–4.

Loewen P. Ethical issues in pharmacy practice research: an introductory guide. Can J Hosp Pharm. 2014;67(2):133–7.

Tsuyuki RT. Designing pharmacy practice research trials. Can J Hosp Pharm . 2014;67(3):226–9.

Bresee LC. An introduction to developing surveys for pharmacy practice research. Can J Hosp Pharm . 2014;67(4):286–91.

Gamble JM. An introduction to the fundamentals of cohort and case–control studies. Can J Hosp Pharm . 2014;67(5):366–72.

Austin Z, Sutton J. Qualitative research: getting started. C an J Hosp Pharm . 2014;67(6):436–40.

Houle S. An introduction to the fundamentals of randomized controlled trials in pharmacy research. Can J Hosp Pharm . 2014; 68(1):28–32.

Charrois TL. Systematic reviews: What do you need to know to get started? Can J Hosp Pharm . 2014;68(2):144–8.

Competing interests: None declared.

Further Reading

Examples of qualitative research in pharmacy practice.

  • Farrell B, Pottie K, Woodend K, Yao V, Dolovich L, Kennie N, et al. Shifts in expectations: evaluating physicians’ perceptions as pharmacists integrated into family practice. J Interprof Care. 2010; 24 (1):80–9. [ PubMed ] [ Google Scholar ]
  • Gregory P, Austin Z. Postgraduation employment experiences of new pharmacists in Ontario in 2012–2013. Can Pharm J. 2014; 147 (5):290–9. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Marks PZ, Jennnings B, Farrell B, Kennie-Kaulbach N, Jorgenson D, Pearson-Sharpe J, et al. “I gained a skill and a change in attitude”: a case study describing how an online continuing professional education course for pharmacists supported achievement of its transfer to practice outcomes. Can J Univ Contin Educ. 2014; 40 (2):1–18. [ Google Scholar ]
  • Nair KM, Dolovich L, Brazil K, Raina P. It’s all about relationships: a qualitative study of health researchers’ perspectives on interdisciplinary research. BMC Health Serv Res. 2008; 8 :110. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Pojskic N, MacKeigan L, Boon H, Austin Z. Initial perceptions of key stakeholders in Ontario regarding independent prescriptive authority for pharmacists. Res Soc Adm Pharm. 2014; 10 (2):341–54. [ PubMed ] [ Google Scholar ]

Qualitative Research in General

  • Breakwell GM, Hammond S, Fife-Schaw C. Research methods in psychology. Thousand Oaks (CA): Sage Publications; 1995. [ Google Scholar ]
  • Given LM. 100 questions (and answers) about qualitative research. Thousand Oaks (CA): Sage Publications; 2015. [ Google Scholar ]
  • Miles B, Huberman AM. Qualitative data analysis. Thousand Oaks (CA): Sage Publications; 2009. [ Google Scholar ]
  • Patton M. Qualitative research and evaluation methods. Thousand Oaks (CA): Sage Publications; 2002. [ Google Scholar ]
  • Willig C. Introducing qualitative research in psychology. Buckingham (UK): Open University Press; 2001. [ Google Scholar ]

Group Dynamics in Focus Groups

  • Farnsworth J, Boon B. Analysing group dynamics within the focus group. Qual Res. 2010; 10 (5):605–24. [ Google Scholar ]

Social Constructivism

  • Social constructivism. Berkeley (CA): University of California, Berkeley, Berkeley Graduate Division, Graduate Student Instruction Teaching & Resource Center; [cited 2015 June 4]. Available from: http://gsi.berkeley.edu/gsi-guide-contents/learning-theory-research/social-constructivism/ [ Google Scholar ]

Mixed Methods

  • Creswell J. Research design: qualitative, quantitative, and mixed methods approaches. Thousand Oaks (CA): Sage Publications; 2009. [ Google Scholar ]

Collecting Qualitative Data

  • Arksey H, Knight P. Interviewing for social scientists: an introductory resource with examples. Thousand Oaks (CA): Sage Publications; 1999. [ Google Scholar ]
  • Guest G, Namey EE, Mitchel ML. Collecting qualitative data: a field manual for applied research. Thousand Oaks (CA): Sage Publications; 2013. [ Google Scholar ]

Constructivist Grounded Theory

  • Charmaz K. Grounded theory: objectivist and constructivist methods. In: Denzin N, Lincoln Y, editors. Handbook of qualitative research. 2nd ed. Thousand Oaks (CA): Sage Publications; 2000. pp. 509–35. [ Google Scholar ]
  • Skip to main content
  • Skip to primary sidebar
  • Skip to footer
  • QuestionPro

survey software icon

  • Solutions Industries Gaming Automotive Sports and events Education Government Travel & Hospitality Financial Services Healthcare Cannabis Technology Use Case NPS+ Communities Audience Contactless surveys Mobile LivePolls Member Experience GDPR Positive People Science 360 Feedback Surveys
  • Resources Blog eBooks Survey Templates Case Studies Training Help center

what is data analysis procedures for qualitative research

Home Market Research

Data Analysis in Research: Types & Methods

data-analysis-in-research

Content Index

Why analyze data in research?

Types of data in research, finding patterns in the qualitative data, methods used for data analysis in qualitative research, preparing data for analysis, methods used for data analysis in quantitative research, considerations in research data analysis, what is data analysis in research.

Definition of research in data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps reduce a large chunk of data into smaller fragments, which makes sense. 

Three essential things occur during the data analysis process — the first is data organization . Summarization and categorization together contribute to becoming the second known method used for data reduction. It helps find patterns and themes in the data for easy identification and linking. The third and last way is data analysis – researchers do it in both top-down and bottom-up fashion.

LEARN ABOUT: Research Process Steps

On the other hand, Marshall and Rossman describe data analysis as a messy, ambiguous, and time-consuming but creative and fascinating process through which a mass of collected data is brought to order, structure and meaning.

We can say that “the data analysis and data interpretation is a process representing the application of deductive and inductive logic to the research and data analysis.”

Researchers rely heavily on data as they have a story to tell or research problems to solve. It starts with a question, and data is nothing but an answer to that question. But, what if there is no question to ask? Well! It is possible to explore data even without a problem – we call it ‘Data Mining’, which often reveals some interesting patterns within the data that are worth exploring.

Irrelevant to the type of data researchers explore, their mission and audiences’ vision guide them to find the patterns to shape the story they want to tell. One of the essential things expected from researchers while analyzing data is to stay open and remain unbiased toward unexpected patterns, expressions, and results. Remember, sometimes, data analysis tells the most unforeseen yet exciting stories that were not expected when initiating data analysis. Therefore, rely on the data you have at hand and enjoy the journey of exploratory research. 

Create a Free Account

Every kind of data has a rare quality of describing things after assigning a specific value to it. For analysis, you need to organize these values, processed and presented in a given context, to make it useful. Data can be in different forms; here are the primary data types.

  • Qualitative data: When the data presented has words and descriptions, then we call it qualitative data . Although you can observe this data, it is subjective and harder to analyze data in research, especially for comparison. Example: Quality data represents everything describing taste, experience, texture, or an opinion that is considered quality data. This type of data is usually collected through focus groups, personal qualitative interviews , qualitative observation or using open-ended questions in surveys.
  • Quantitative data: Any data expressed in numbers of numerical figures are called quantitative data . This type of data can be distinguished into categories, grouped, measured, calculated, or ranked. Example: questions such as age, rank, cost, length, weight, scores, etc. everything comes under this type of data. You can present such data in graphical format, charts, or apply statistical analysis methods to this data. The (Outcomes Measurement Systems) OMS questionnaires in surveys are a significant source of collecting numeric data.
  • Categorical data: It is data presented in groups. However, an item included in the categorical data cannot belong to more than one group. Example: A person responding to a survey by telling his living style, marital status, smoking habit, or drinking habit comes under the categorical data. A chi-square test is a standard method used to analyze this data.

Learn More : Examples of Qualitative Data in Education

Data analysis in qualitative research

Data analysis and qualitative data research work a little differently from the numerical data as the quality data is made up of words, descriptions, images, objects, and sometimes symbols. Getting insight from such complicated information is a complicated process. Hence it is typically used for exploratory research and data analysis .

Although there are several ways to find patterns in the textual information, a word-based method is the most relied and widely used global technique for research and data analysis. Notably, the data analysis process in qualitative research is manual. Here the researchers usually read the available data and find repetitive or commonly used words. 

For example, while studying data collected from African countries to understand the most pressing issues people face, researchers might find  “food”  and  “hunger” are the most commonly used words and will highlight them for further analysis.

LEARN ABOUT: Level of Analysis

The keyword context is another widely used word-based technique. In this method, the researcher tries to understand the concept by analyzing the context in which the participants use a particular keyword.  

For example , researchers conducting research and data analysis for studying the concept of ‘diabetes’ amongst respondents might analyze the context of when and how the respondent has used or referred to the word ‘diabetes.’

The scrutiny-based technique is also one of the highly recommended  text analysis  methods used to identify a quality data pattern. Compare and contrast is the widely used method under this technique to differentiate how a specific text is similar or different from each other. 

For example: To find out the “importance of resident doctor in a company,” the collected data is divided into people who think it is necessary to hire a resident doctor and those who think it is unnecessary. Compare and contrast is the best method that can be used to analyze the polls having single-answer questions types .

Metaphors can be used to reduce the data pile and find patterns in it so that it becomes easier to connect data with theory.

Variable Partitioning is another technique used to split variables so that researchers can find more coherent descriptions and explanations from the enormous data.

LEARN ABOUT: Qualitative Research Questions and Questionnaires

There are several techniques to analyze the data in qualitative research, but here are some commonly used methods,

  • Content Analysis:  It is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze the documented information from text, images, and sometimes from the physical items. It depends on the research questions to predict when and where to use this method.
  • Narrative Analysis: This method is used to analyze content gathered from various sources such as personal interviews, field observation, and  surveys . The majority of times, stories, or opinions shared by people are focused on finding answers to the research questions.
  • Discourse Analysis:  Similar to narrative analysis, discourse analysis is used to analyze the interactions with people. Nevertheless, this particular method considers the social context under which or within which the communication between the researcher and respondent takes place. In addition to that, discourse analysis also focuses on the lifestyle and day-to-day environment while deriving any conclusion.
  • Grounded Theory:  When you want to explain why a particular phenomenon happened, then using grounded theory for analyzing quality data is the best resort. Grounded theory is applied to study data about the host of similar cases occurring in different settings. When researchers are using this method, they might alter explanations or produce new ones until they arrive at some conclusion.

LEARN ABOUT: 12 Best Tools for Researchers

Data analysis in quantitative research

The first stage in research and data analysis is to make it for the analysis so that the nominal data can be converted into something meaningful. Data preparation consists of the below phases.

Phase I: Data Validation

Data validation is done to understand if the collected data sample is per the pre-set standards, or it is a biased data sample again divided into four different stages

  • Fraud: To ensure an actual human being records each response to the survey or the questionnaire
  • Screening: To make sure each participant or respondent is selected or chosen in compliance with the research criteria
  • Procedure: To ensure ethical standards were maintained while collecting the data sample
  • Completeness: To ensure that the respondent has answered all the questions in an online survey. Else, the interviewer had asked all the questions devised in the questionnaire.

Phase II: Data Editing

More often, an extensive research data sample comes loaded with errors. Respondents sometimes fill in some fields incorrectly or sometimes skip them accidentally. Data editing is a process wherein the researchers have to confirm that the provided data is free of such errors. They need to conduct necessary checks and outlier checks to edit the raw edit and make it ready for analysis.

Phase III: Data Coding

Out of all three, this is the most critical phase of data preparation associated with grouping and assigning values to the survey responses . If a survey is completed with a 1000 sample size, the researcher will create an age bracket to distinguish the respondents based on their age. Thus, it becomes easier to analyze small data buckets rather than deal with the massive data pile.

LEARN ABOUT: Steps in Qualitative Research

After the data is prepared for analysis, researchers are open to using different research and data analysis methods to derive meaningful insights. For sure, statistical analysis plans are the most favored to analyze numerical data. In statistical analysis, distinguishing between categorical data and numerical data is essential, as categorical data involves distinct categories or labels, while numerical data consists of measurable quantities. The method is again classified into two groups. First, ‘Descriptive Statistics’ used to describe data. Second, ‘Inferential statistics’ that helps in comparing the data .

Descriptive statistics

This method is used to describe the basic features of versatile types of data in research. It presents the data in such a meaningful way that pattern in the data starts making sense. Nevertheless, the descriptive analysis does not go beyond making conclusions. The conclusions are again based on the hypothesis researchers have formulated so far. Here are a few major types of descriptive analysis methods.

Measures of Frequency

  • Count, Percent, Frequency
  • It is used to denote home often a particular event occurs.
  • Researchers use it when they want to showcase how often a response is given.

Measures of Central Tendency

  • Mean, Median, Mode
  • The method is widely used to demonstrate distribution by various points.
  • Researchers use this method when they want to showcase the most commonly or averagely indicated response.

Measures of Dispersion or Variation

  • Range, Variance, Standard deviation
  • Here the field equals high/low points.
  • Variance standard deviation = difference between the observed score and mean
  • It is used to identify the spread of scores by stating intervals.
  • Researchers use this method to showcase data spread out. It helps them identify the depth until which the data is spread out that it directly affects the mean.

Measures of Position

  • Percentile ranks, Quartile ranks
  • It relies on standardized scores helping researchers to identify the relationship between different scores.
  • It is often used when researchers want to compare scores with the average count.

For quantitative research use of descriptive analysis often give absolute numbers, but the in-depth analysis is never sufficient to demonstrate the rationale behind those numbers. Nevertheless, it is necessary to think of the best method for research and data analysis suiting your survey questionnaire and what story researchers want to tell. For example, the mean is the best way to demonstrate the students’ average scores in schools. It is better to rely on the descriptive statistics when the researchers intend to keep the research or outcome limited to the provided  sample  without generalizing it. For example, when you want to compare average voting done in two different cities, differential statistics are enough.

Descriptive analysis is also called a ‘univariate analysis’ since it is commonly used to analyze a single variable.

Inferential statistics

Inferential statistics are used to make predictions about a larger population after research and data analysis of the representing population’s collected sample. For example, you can ask some odd 100 audiences at a movie theater if they like the movie they are watching. Researchers then use inferential statistics on the collected  sample  to reason that about 80-90% of people like the movie. 

Here are two significant areas of inferential statistics.

  • Estimating parameters: It takes statistics from the sample research data and demonstrates something about the population parameter.
  • Hypothesis test: I t’s about sampling research data to answer the survey research questions. For example, researchers might be interested to understand if the new shade of lipstick recently launched is good or not, or if the multivitamin capsules help children to perform better at games.

These are sophisticated analysis methods used to showcase the relationship between different variables instead of describing a single variable. It is often used when researchers want something beyond absolute numbers to understand the relationship between variables.

Here are some of the commonly used methods for data analysis in research.

  • Correlation: When researchers are not conducting experimental research or quasi-experimental research wherein the researchers are interested to understand the relationship between two or more variables, they opt for correlational research methods.
  • Cross-tabulation: Also called contingency tables,  cross-tabulation  is used to analyze the relationship between multiple variables.  Suppose provided data has age and gender categories presented in rows and columns. A two-dimensional cross-tabulation helps for seamless data analysis and research by showing the number of males and females in each age category.
  • Regression analysis: For understanding the strong relationship between two variables, researchers do not look beyond the primary and commonly used regression analysis method, which is also a type of predictive analysis used. In this method, you have an essential factor called the dependent variable. You also have multiple independent variables in regression analysis. You undertake efforts to find out the impact of independent variables on the dependent variable. The values of both independent and dependent variables are assumed as being ascertained in an error-free random manner.
  • Frequency tables: The statistical procedure is used for testing the degree to which two or more vary or differ in an experiment. A considerable degree of variation means research findings were significant. In many contexts, ANOVA testing and variance analysis are similar.
  • Analysis of variance: The statistical procedure is used for testing the degree to which two or more vary or differ in an experiment. A considerable degree of variation means research findings were significant. In many contexts, ANOVA testing and variance analysis are similar.
  • Researchers must have the necessary research skills to analyze and manipulation the data , Getting trained to demonstrate a high standard of research practice. Ideally, researchers must possess more than a basic understanding of the rationale of selecting one statistical method over the other to obtain better data insights.
  • Usually, research and data analytics projects differ by scientific discipline; therefore, getting statistical advice at the beginning of analysis helps design a survey questionnaire, select data collection methods , and choose samples.

LEARN ABOUT: Best Data Collection Tools

  • The primary aim of data research and analysis is to derive ultimate insights that are unbiased. Any mistake in or keeping a biased mind to collect data, selecting an analysis method, or choosing  audience  sample il to draw a biased inference.
  • Irrelevant to the sophistication used in research data and analysis is enough to rectify the poorly defined objective outcome measurements. It does not matter if the design is at fault or intentions are not clear, but lack of clarity might mislead readers, so avoid the practice.
  • The motive behind data analysis in research is to present accurate and reliable data. As far as possible, avoid statistical errors, and find a way to deal with everyday challenges like outliers, missing data, data altering, data mining , or developing graphical representation.

LEARN MORE: Descriptive Research vs Correlational Research The sheer amount of data generated daily is frightening. Especially when data analysis has taken center stage. in 2018. In last year, the total data supply amounted to 2.8 trillion gigabytes. Hence, it is clear that the enterprises willing to survive in the hypercompetitive world must possess an excellent capability to analyze complex research data, derive actionable insights, and adapt to the new market needs.

LEARN ABOUT: Average Order Value

QuestionPro is an online survey platform that empowers organizations in data analysis and research and provides them a medium to collect data by creating appealing surveys.

MORE LIKE THIS

data information vs insight

Data Information vs Insight: Essential differences

May 14, 2024

pricing analytics software

Pricing Analytics Software: Optimize Your Pricing Strategy

May 13, 2024

relationship marketing

Relationship Marketing: What It Is, Examples & Top 7 Benefits

May 8, 2024

email survey tool

The Best Email Survey Tool to Boost Your Feedback Game

May 7, 2024

Other categories

  • Academic Research
  • Artificial Intelligence
  • Assessments
  • Brand Awareness
  • Case Studies
  • Communities
  • Consumer Insights
  • Customer effort score
  • Customer Engagement
  • Customer Experience
  • Customer Loyalty
  • Customer Research
  • Customer Satisfaction
  • Employee Benefits
  • Employee Engagement
  • Employee Retention
  • Friday Five
  • General Data Protection Regulation
  • Insights Hub
  • Life@QuestionPro
  • Market Research
  • Mobile diaries
  • Mobile Surveys
  • New Features
  • Online Communities
  • Question Types
  • Questionnaire
  • QuestionPro Products
  • Release Notes
  • Research Tools and Apps
  • Revenue at Risk
  • Survey Templates
  • Training Tips
  • Uncategorized
  • Video Learning Series
  • What’s Coming Up
  • Workforce Intelligence

Log in using your username and password

  • Search More Search for this keyword Advanced search
  • Latest content
  • Current issue
  • Write for Us
  • BMJ Journals More You are viewing from: Google Indexer

You are here

  • Volume 17, Issue 1
  • Qualitative data analysis: a practical example
  • Article Text
  • Article info
  • Citation Tools
  • Rapid Responses
  • Article metrics

Download PDF

  • Helen Noble 1 ,
  • Joanna Smith 2
  • 1 School of Nursing and Midwifery, Queens's University Belfast , Belfast , UK
  • 2 Department of Health Sciences , University of Huddersfield , Huddersfield , UK
  • Correspondence to : Dr Helen Noble School of Nursing and Midwifery, Queen's University Belfast, Medical Biology Centre, 97 Lisburn Road, Belfast BT9 7BL, UK; helen.noble{at}qub.ac.uk

https://doi.org/10.1136/eb-2013-101603

Statistics from Altmetric.com

Request permissions.

If you wish to reuse any or all of this article please use the link below which will take you to the Copyright Clearance Center’s RightsLink service. You will be able to get a quick price and instant permission to reuse the content in many different ways.

The aim of this paper is to equip readers with an understanding of the principles of qualitative data analysis and offer a practical example of how analysis might be undertaken in an interview-based study.

What is qualitative data analysis?

What are the approaches in undertaking qualitative data analysis.

Although qualitative data analysis is inductive and focuses on meaning, approaches in analysing data are diverse with different purposes and ontological (concerned with the nature of being) and epistemological (knowledge and understanding) underpinnings. 2 Identifying an appropriate approach in analysing qualitative data analysis to meet the aim of a study can be challenging. One way to understand qualitative data analysis is to consider the processes involved. 3 Approaches can be divided into four broad groups: quasistatistical approaches such as content analysis; the use of frameworks or matrices such as a framework approach and thematic analysis; interpretative approaches that include interpretative phenomenological analysis and grounded theory; and sociolinguistic approaches such as discourse analysis and conversation analysis. However, there are commonalities across approaches. Data analysis is an interactive process, where data are systematically searched and analysed in order to provide an illuminating description of phenomena; for example, the experience of carers supporting dying patients with renal disease 4 or student nurses’ experiences following assignment referral. 5 Data analysis is an iterative or recurring process, essential to the creativity of the analysis, development of ideas, clarifying meaning and the reworking of concepts as new insights ‘emerge’ or are identified in the data.

Do you need data software packages when analysing qualitative data?

Qualitative data software packages are not a prerequisite for undertaking qualitative analysis but a range of programmes are available that can assist the qualitative researcher. Software programmes vary in design and application but can be divided into text retrievers, code and retrieve packages and theory builders. 6 NVivo and NUD*IST are widely used because they have sophisticated code and retrieve functions and modelling capabilities, which speed up the process of managing large data sets and data retrieval. Repetitions within data can be quantified and memos and hyperlinks attached to data. Analytical processes can be mapped and tracked and linkages across data visualised leading to theory development. 6 Disadvantages of using qualitative data software packages include the complexity of the software and some programmes are not compatible with standard text format. Extensive coding and categorising can result in data becoming unmanageable and researchers may find visualising data on screen inhibits conceptualisation of the data.

How do you begin analysing qualitative data?

Despite the diversity of qualitative methods, the subsequent analysis is based on a common set of principles and for interview data includes: transcribing the interviews; immersing oneself within the data to gain detailed insights into the phenomena being explored; developing a data coding system; and linking codes or units of data to form overarching themes/concepts, which may lead to the development of theory. 2 Identifying recurring and significant themes, whereby data are methodically searched to identify patterns in order to provide an illuminating description of a phenomenon, is a central skill in undertaking qualitative data analysis. Table 1 contains an extract of data taken from a research study which included interviews with carers of people with end-stage renal disease managed without dialysis. The extract is taken from a carer who is trying to understand why her mother was not offered dialysis. The first stage of data analysis involves the process of initial coding, whereby each line of the data is considered to identify keywords or phrases; these are sometimes known as in vivo codes (highlighted) because they retain participants’ words.

  • View inline

Data extract containing units of data and line-by-line coding

When transcripts have been broken down into manageable sections, the researcher sorts and sifts them, searching for types, classes, sequences, processes, patterns or wholes. The next stage of data analysis involves bringing similar categories together into broader themes. Table 2 provides an example of the early development of codes and categories and how these link to form broad initial themes.

Development of initial themes from descriptive codes

Table 3 presents an example of further category development leading to final themes which link to an overarching concept.

Development of final themes and overarching concept

How do qualitative researchers ensure data analysis procedures are transparent and robust?

In congruence with quantitative researchers, ensuring qualitative studies are methodologically robust is essential. Qualitative researchers need to be explicit in describing how and why they undertook the research. However, qualitative research is criticised for lacking transparency in relation to the analytical processes employed, which hinders the ability of the reader to critically appraise study findings. 7 In the three tables presented the progress from units of data to coding to theme development is illustrated. ‘Not involved in treatment decisions’ appears in each table and informs one of the final themes. Documenting the movement from units of data to final themes allows for transparency of data analysis. Although other researchers may interpret the data differently, appreciating and understanding how the themes were developed is an essential part of demonstrating the robustness of the findings. Qualitative researchers must demonstrate rigour, associated with openness, relevance to practice and congruence of the methodological approch. 2 In summary qualitative research is complex in that it produces large amounts of data and analysis is time consuming and complex. High-quality data analysis requires a researcher with expertise, vision and veracity.

  • Cheater F ,
  • Robshaw M ,
  • McLafferty E ,
  • Maggs-Rapport F

Competing interests None.

Read the full text or download the PDF:

Learn / Guides / Qualitative data analysis guide

Back to guides

5 qualitative data analysis methods

Qualitative data uncovers valuable insights that help you improve the user and customer experience. But how exactly do you measure and analyze data that isn't quantifiable?

There are different qualitative data analysis methods to help you make sense of qualitative feedback and customer insights, depending on your business goals and the type of data you've collected.

Before you choose a qualitative data analysis method for your team, you need to consider the available techniques and explore their use cases to understand how each process might help you better understand your users. 

This guide covers five qualitative analysis methods to choose from, and will help you pick the right one(s) based on your goals. 

Content analysis

Thematic analysis

Narrative analysis

Grounded theory analysis

Discourse analysis

5 qualitative data analysis methods explained

Qualitative data analysis ( QDA ) is the process of organizing, analyzing, and interpreting qualitative research data—non-numeric, conceptual information, and user feedback—to capture themes and patterns, answer research questions, and identify actions to improve your product or website.

Step 1 in the research process (after planning ) is qualitative data collection. You can use behavior analytics software—like Hotjar —to capture qualitative data with context, and learn the real motivation behind user behavior, by collecting written customer feedback with Surveys or scheduling an in-depth user interview with Engage .

Use Hotjar’s tools to collect feedback, uncover behavior trends, and understand the ‘why’ behind user actions.

1. Content analysis

Content analysis is a qualitative research method that examines and quantifies the presence of certain words, subjects, and concepts in text, image, video, or audio messages. The method transforms qualitative input into quantitative data to help you make reliable conclusions about what customers think of your brand, and how you can improve their experience and opinion.

Conduct content analysis manually (which can be time-consuming) or use analysis tools like Lexalytics to reveal communication patterns, uncover differences in individual or group communication trends, and make broader connections between concepts.

#Benefits and challenges of using content analysis

How content analysis can help your team

Content analysis is often used by marketers and customer service specialists, helping them understand customer behavior and measure brand reputation.

For example, you may run a customer survey with open-ended questions to discover users’ concerns—in their own words—about their experience with your product. Instead of having to process hundreds of answers manually, a content analysis tool helps you analyze and group results based on the emotion expressed in texts.

Some other examples of content analysis include:

Analyzing brand mentions on social media to understand your brand's reputation

Reviewing customer feedback to evaluate (and then improve) the customer and user experience (UX)

Researching competitors’ website pages to identify their competitive advantages and value propositions

Interpreting customer interviews and survey results to determine user preferences, and setting the direction for new product or feature developments

Content analysis was a major part of our growth during my time at Hypercontext.

[It gave us] a better understanding of the [blog] topics that performed best for signing new users up. We were also able to go deeper within those blog posts to better understand the formats [that worked].

2. Thematic analysis

Thematic analysis helps you identify, categorize, analyze, and interpret patterns in qualitative study data , and can be done with tools like Dovetail and Thematic .

While content analysis and thematic analysis seem similar, they're different in concept: 

Content analysis can be applied to both qualitative and quantitative data , and focuses on identifying frequencies and recurring words and subjects

Thematic analysis can only be applied to qualitative data, and focuses on identifying patterns and themes

#The benefits and drawbacks of thematic analysis

How thematic analysis can help your team

Thematic analysis can be used by pretty much anyone: from product marketers, to customer relationship managers, to UX researchers.

For example, product teams use thematic analysis to better understand user behaviors and needs and improve UX . Analyzing customer feedback lets you identify themes (e.g. poor navigation or a buggy mobile interface) highlighted by users and get actionable insight into what they really expect from the product. 

💡 Pro tip: looking for a way to expedite the data analysis process for large amounts of data you collected with a survey? Try Hotjar’s AI for Surveys : along with generating a survey based on your goal in seconds, our AI will analyze the raw data and prepare an automated summary report that presents key thematic findings, respondent quotes, and actionable steps to take, making the analysis of qualitative data a breeze.

3. Narrative analysis

Narrative analysis is a method used to interpret research participants’ stories —things like testimonials , case studies, focus groups, interviews, and other text or visual data—with tools like Delve and AI-powered ATLAS.ti .

Some formats don’t work well with narrative analysis, including heavily structured interviews and written surveys, which don’t give participants as much opportunity to tell their stories in their own words.

#Benefits and challenges of narrative analysis

How narrative analysis can help your team

Narrative analysis provides product teams with valuable insight into the complexity of customers’ lives, feelings, and behaviors.

In a marketing research context, narrative analysis involves capturing and reviewing customer stories—on social media, for example—to get in-depth insight into their lives, priorities, and challenges. 

This might look like analyzing daily content shared by your audiences’ favorite influencers on Instagram, or analyzing customer reviews on sites like G2 or Capterra to gain a deep understanding of individual customer experiences. The results of this analysis also contribute to developing corresponding customer personas .

💡 Pro tip: conducting user interviews is an excellent way to collect data for narrative analysis. Though interviews can be time-intensive, there are tools out there that streamline the workload. 

Hotjar Engage automates the entire process, from recruiting to scheduling to generating the all-important interview transcripts you’ll need for the analysis phase of your research project.

4. Grounded theory analysis

Grounded theory analysis is a method of conducting qualitative research to develop theories by examining real-world data. This technique involves the creation of hypotheses and theories through qualitative data collection and evaluation, and can be performed with qualitative data analysis software tools like MAXQDA and NVivo .

Unlike other qualitative data analysis techniques, this method is inductive rather than deductive: it develops theories from data, not the other way around.

#The benefits and challenges of grounded theory analysis

How grounded theory analysis can help your team

Grounded theory analysis is used by software engineers, product marketers, managers, and other specialists who deal with data sets to make informed business decisions. 

For example, product marketing teams may turn to customer surveys to understand the reasons behind high churn rates , then use grounded theory to analyze responses and develop hypotheses about why users churn, and how you can get them to stay. 

Grounded theory can also be helpful in the talent management process. For example, HR representatives may use it to develop theories about low employee engagement, and come up with solutions based on their research findings.

5. Discourse analysis

Discourse analysis is the act of researching the underlying meaning of qualitative data. It involves the observation of texts, audio, and videos to study the relationships between information and its social context.

In contrast to content analysis, this method focuses on the contextual meaning of language: discourse analysis sheds light on what audiences think of a topic, and why they feel the way they do about it.

#Benefits and challenges of discourse analysis

How discourse analysis can help your team

In a business context, this method is primarily used by marketing teams. Discourse analysis helps marketers understand the norms and ideas in their market , and reveals why they play such a significant role for their customers. 

Once the origins of trends are uncovered, it’s easier to develop a company mission, create a unique tone of voice, and craft effective marketing messages.

Which qualitative data analysis method should you choose?

While the five qualitative data analysis methods we list above are all aimed at processing data and answering research questions, these techniques differ in their intent and the approaches applied.  

Choosing the right analysis method for your team isn't a matter of preference—selecting a method that fits is only possible once you define your research goals and have a clear intention. When you know what you need (and why you need it), you can identify an analysis method that aligns with your research objectives.

Gather qualitative data with Hotjar

Use Hotjar’s product experience insights in your qualitative research. Collect feedback, uncover behavior trends, and understand the ‘why’ behind user actions.

FAQs about qualitative data analysis methods

What is the qualitative data analysis approach.

The qualitative data analysis approach refers to the process of systematizing descriptive data collected through interviews, focus groups, surveys, and observations and then interpreting it. The methodology aims to identify patterns and themes behind textual data, and other unquantifiable data, as opposed to numerical data.

What are qualitative data analysis methods?

Five popular qualitative data analysis methods are:

What is the process of qualitative data analysis?

The process of qualitative data analysis includes six steps:

Define your research question

Prepare the data

Choose the method of qualitative analysis

Code the data

Identify themes, patterns, and relationships

Make hypotheses and act

Qualitative data analysis guide

Previous chapter

QDA challenges

Next chapter

Your Modern Business Guide To Data Analysis Methods And Techniques

Data analysis methods and techniques blog post by datapine

Table of Contents

1) What Is Data Analysis?

2) Why Is Data Analysis Important?

3) What Is The Data Analysis Process?

4) Types Of Data Analysis Methods

5) Top Data Analysis Techniques To Apply

6) Quality Criteria For Data Analysis

7) Data Analysis Limitations & Barriers

8) Data Analysis Skills

9) Data Analysis In The Big Data Environment

In our data-rich age, understanding how to analyze and extract true meaning from our business’s digital insights is one of the primary drivers of success.

Despite the colossal volume of data we create every day, a mere 0.5% is actually analyzed and used for data discovery , improvement, and intelligence. While that may not seem like much, considering the amount of digital information we have at our fingertips, half a percent still accounts for a vast amount of data.

With so much data and so little time, knowing how to collect, curate, organize, and make sense of all of this potentially business-boosting information can be a minefield – but online data analysis is the solution.

In science, data analysis uses a more complex approach with advanced techniques to explore and experiment with data. On the other hand, in a business context, data is used to make data-driven decisions that will enable the company to improve its overall performance. In this post, we will cover the analysis of data from an organizational point of view while still going through the scientific and statistical foundations that are fundamental to understanding the basics of data analysis. 

To put all of that into perspective, we will answer a host of important analytical questions, explore analytical methods and techniques, while demonstrating how to perform analysis in the real world with a 17-step blueprint for success.

What Is Data Analysis?

Data analysis is the process of collecting, modeling, and analyzing data using various statistical and logical methods and techniques. Businesses rely on analytics processes and tools to extract insights that support strategic and operational decision-making.

All these various methods are largely based on two core areas: quantitative and qualitative research.

To explain the key differences between qualitative and quantitative research, here’s a video for your viewing pleasure:

Gaining a better understanding of different techniques and methods in quantitative research as well as qualitative insights will give your analyzing efforts a more clearly defined direction, so it’s worth taking the time to allow this particular knowledge to sink in. Additionally, you will be able to create a comprehensive analytical report that will skyrocket your analysis.

Apart from qualitative and quantitative categories, there are also other types of data that you should be aware of before dividing into complex data analysis processes. These categories include: 

  • Big data: Refers to massive data sets that need to be analyzed using advanced software to reveal patterns and trends. It is considered to be one of the best analytical assets as it provides larger volumes of data at a faster rate. 
  • Metadata: Putting it simply, metadata is data that provides insights about other data. It summarizes key information about specific data that makes it easier to find and reuse for later purposes. 
  • Real time data: As its name suggests, real time data is presented as soon as it is acquired. From an organizational perspective, this is the most valuable data as it can help you make important decisions based on the latest developments. Our guide on real time analytics will tell you more about the topic. 
  • Machine data: This is more complex data that is generated solely by a machine such as phones, computers, or even websites and embedded systems, without previous human interaction.

Why Is Data Analysis Important?

Before we go into detail about the categories of analysis along with its methods and techniques, you must understand the potential that analyzing data can bring to your organization.

  • Informed decision-making : From a management perspective, you can benefit from analyzing your data as it helps you make decisions based on facts and not simple intuition. For instance, you can understand where to invest your capital, detect growth opportunities, predict your income, or tackle uncommon situations before they become problems. Through this, you can extract relevant insights from all areas in your organization, and with the help of dashboard software , present the data in a professional and interactive way to different stakeholders.
  • Reduce costs : Another great benefit is to reduce costs. With the help of advanced technologies such as predictive analytics, businesses can spot improvement opportunities, trends, and patterns in their data and plan their strategies accordingly. In time, this will help you save money and resources on implementing the wrong strategies. And not just that, by predicting different scenarios such as sales and demand you can also anticipate production and supply. 
  • Target customers better : Customers are arguably the most crucial element in any business. By using analytics to get a 360° vision of all aspects related to your customers, you can understand which channels they use to communicate with you, their demographics, interests, habits, purchasing behaviors, and more. In the long run, it will drive success to your marketing strategies, allow you to identify new potential customers, and avoid wasting resources on targeting the wrong people or sending the wrong message. You can also track customer satisfaction by analyzing your client’s reviews or your customer service department’s performance.

What Is The Data Analysis Process?

Data analysis process graphic

When we talk about analyzing data there is an order to follow in order to extract the needed conclusions. The analysis process consists of 5 key stages. We will cover each of them more in detail later in the post, but to start providing the needed context to understand what is coming next, here is a rundown of the 5 essential steps of data analysis. 

  • Identify: Before you get your hands dirty with data, you first need to identify why you need it in the first place. The identification is the stage in which you establish the questions you will need to answer. For example, what is the customer's perception of our brand? Or what type of packaging is more engaging to our potential customers? Once the questions are outlined you are ready for the next step. 
  • Collect: As its name suggests, this is the stage where you start collecting the needed data. Here, you define which sources of data you will use and how you will use them. The collection of data can come in different forms such as internal or external sources, surveys, interviews, questionnaires, and focus groups, among others.  An important note here is that the way you collect the data will be different in a quantitative and qualitative scenario. 
  • Clean: Once you have the necessary data it is time to clean it and leave it ready for analysis. Not all the data you collect will be useful, when collecting big amounts of data in different formats it is very likely that you will find yourself with duplicate or badly formatted data. To avoid this, before you start working with your data you need to make sure to erase any white spaces, duplicate records, or formatting errors. This way you avoid hurting your analysis with bad-quality data. 
  • Analyze : With the help of various techniques such as statistical analysis, regressions, neural networks, text analysis, and more, you can start analyzing and manipulating your data to extract relevant conclusions. At this stage, you find trends, correlations, variations, and patterns that can help you answer the questions you first thought of in the identify stage. Various technologies in the market assist researchers and average users with the management of their data. Some of them include business intelligence and visualization software, predictive analytics, and data mining, among others. 
  • Interpret: Last but not least you have one of the most important steps: it is time to interpret your results. This stage is where the researcher comes up with courses of action based on the findings. For example, here you would understand if your clients prefer packaging that is red or green, plastic or paper, etc. Additionally, at this stage, you can also find some limitations and work on them. 

Now that you have a basic understanding of the key data analysis steps, let’s look at the top 17 essential methods.

17 Essential Types Of Data Analysis Methods

Before diving into the 17 essential types of methods, it is important that we go over really fast through the main analysis categories. Starting with the category of descriptive up to prescriptive analysis, the complexity and effort of data evaluation increases, but also the added value for the company.

a) Descriptive analysis - What happened.

The descriptive analysis method is the starting point for any analytic reflection, and it aims to answer the question of what happened? It does this by ordering, manipulating, and interpreting raw data from various sources to turn it into valuable insights for your organization.

Performing descriptive analysis is essential, as it enables us to present our insights in a meaningful way. Although it is relevant to mention that this analysis on its own will not allow you to predict future outcomes or tell you the answer to questions like why something happened, it will leave your data organized and ready to conduct further investigations.

b) Exploratory analysis - How to explore data relationships.

As its name suggests, the main aim of the exploratory analysis is to explore. Prior to it, there is still no notion of the relationship between the data and the variables. Once the data is investigated, exploratory analysis helps you to find connections and generate hypotheses and solutions for specific problems. A typical area of ​​application for it is data mining.

c) Diagnostic analysis - Why it happened.

Diagnostic data analytics empowers analysts and executives by helping them gain a firm contextual understanding of why something happened. If you know why something happened as well as how it happened, you will be able to pinpoint the exact ways of tackling the issue or challenge.

Designed to provide direct and actionable answers to specific questions, this is one of the world’s most important methods in research, among its other key organizational functions such as retail analytics , e.g.

c) Predictive analysis - What will happen.

The predictive method allows you to look into the future to answer the question: what will happen? In order to do this, it uses the results of the previously mentioned descriptive, exploratory, and diagnostic analysis, in addition to machine learning (ML) and artificial intelligence (AI). Through this, you can uncover future trends, potential problems or inefficiencies, connections, and casualties in your data.

With predictive analysis, you can unfold and develop initiatives that will not only enhance your various operational processes but also help you gain an all-important edge over the competition. If you understand why a trend, pattern, or event happened through data, you will be able to develop an informed projection of how things may unfold in particular areas of the business.

e) Prescriptive analysis - How will it happen.

Another of the most effective types of analysis methods in research. Prescriptive data techniques cross over from predictive analysis in the way that it revolves around using patterns or trends to develop responsive, practical business strategies.

By drilling down into prescriptive analysis, you will play an active role in the data consumption process by taking well-arranged sets of visual data and using it as a powerful fix to emerging issues in a number of key areas, including marketing, sales, customer experience, HR, fulfillment, finance, logistics analytics , and others.

Top 17 data analysis methods

As mentioned at the beginning of the post, data analysis methods can be divided into two big categories: quantitative and qualitative. Each of these categories holds a powerful analytical value that changes depending on the scenario and type of data you are working with. Below, we will discuss 17 methods that are divided into qualitative and quantitative approaches. 

Without further ado, here are the 17 essential types of data analysis methods with some use cases in the business world: 

A. Quantitative Methods 

To put it simply, quantitative analysis refers to all methods that use numerical data or data that can be turned into numbers (e.g. category variables like gender, age, etc.) to extract valuable insights. It is used to extract valuable conclusions about relationships, differences, and test hypotheses. Below we discuss some of the key quantitative methods. 

1. Cluster analysis

The action of grouping a set of data elements in a way that said elements are more similar (in a particular sense) to each other than to those in other groups – hence the term ‘cluster.’ Since there is no target variable when clustering, the method is often used to find hidden patterns in the data. The approach is also used to provide additional context to a trend or dataset.

Let's look at it from an organizational perspective. In a perfect world, marketers would be able to analyze each customer separately and give them the best-personalized service, but let's face it, with a large customer base, it is timely impossible to do that. That's where clustering comes in. By grouping customers into clusters based on demographics, purchasing behaviors, monetary value, or any other factor that might be relevant for your company, you will be able to immediately optimize your efforts and give your customers the best experience based on their needs.

2. Cohort analysis

This type of data analysis approach uses historical data to examine and compare a determined segment of users' behavior, which can then be grouped with others with similar characteristics. By using this methodology, it's possible to gain a wealth of insight into consumer needs or a firm understanding of a broader target group.

Cohort analysis can be really useful for performing analysis in marketing as it will allow you to understand the impact of your campaigns on specific groups of customers. To exemplify, imagine you send an email campaign encouraging customers to sign up for your site. For this, you create two versions of the campaign with different designs, CTAs, and ad content. Later on, you can use cohort analysis to track the performance of the campaign for a longer period of time and understand which type of content is driving your customers to sign up, repurchase, or engage in other ways.  

A useful tool to start performing cohort analysis method is Google Analytics. You can learn more about the benefits and limitations of using cohorts in GA in this useful guide . In the bottom image, you see an example of how you visualize a cohort in this tool. The segments (devices traffic) are divided into date cohorts (usage of devices) and then analyzed week by week to extract insights into performance.

Cohort analysis chart example from google analytics

3. Regression analysis

Regression uses historical data to understand how a dependent variable's value is affected when one (linear regression) or more independent variables (multiple regression) change or stay the same. By understanding each variable's relationship and how it developed in the past, you can anticipate possible outcomes and make better decisions in the future.

Let's bring it down with an example. Imagine you did a regression analysis of your sales in 2019 and discovered that variables like product quality, store design, customer service, marketing campaigns, and sales channels affected the overall result. Now you want to use regression to analyze which of these variables changed or if any new ones appeared during 2020. For example, you couldn’t sell as much in your physical store due to COVID lockdowns. Therefore, your sales could’ve either dropped in general or increased in your online channels. Through this, you can understand which independent variables affected the overall performance of your dependent variable, annual sales.

If you want to go deeper into this type of analysis, check out this article and learn more about how you can benefit from regression.

4. Neural networks

The neural network forms the basis for the intelligent algorithms of machine learning. It is a form of analytics that attempts, with minimal intervention, to understand how the human brain would generate insights and predict values. Neural networks learn from each and every data transaction, meaning that they evolve and advance over time.

A typical area of application for neural networks is predictive analytics. There are BI reporting tools that have this feature implemented within them, such as the Predictive Analytics Tool from datapine. This tool enables users to quickly and easily generate all kinds of predictions. All you have to do is select the data to be processed based on your KPIs, and the software automatically calculates forecasts based on historical and current data. Thanks to its user-friendly interface, anyone in your organization can manage it; there’s no need to be an advanced scientist. 

Here is an example of how you can use the predictive analysis tool from datapine:

Example on how to use predictive analytics tool from datapine

**click to enlarge**

5. Factor analysis

The factor analysis also called “dimension reduction” is a type of data analysis used to describe variability among observed, correlated variables in terms of a potentially lower number of unobserved variables called factors. The aim here is to uncover independent latent variables, an ideal method for streamlining specific segments.

A good way to understand this data analysis method is a customer evaluation of a product. The initial assessment is based on different variables like color, shape, wearability, current trends, materials, comfort, the place where they bought the product, and frequency of usage. Like this, the list can be endless, depending on what you want to track. In this case, factor analysis comes into the picture by summarizing all of these variables into homogenous groups, for example, by grouping the variables color, materials, quality, and trends into a brother latent variable of design.

If you want to start analyzing data using factor analysis we recommend you take a look at this practical guide from UCLA.

6. Data mining

A method of data analysis that is the umbrella term for engineering metrics and insights for additional value, direction, and context. By using exploratory statistical evaluation, data mining aims to identify dependencies, relations, patterns, and trends to generate advanced knowledge.  When considering how to analyze data, adopting a data mining mindset is essential to success - as such, it’s an area that is worth exploring in greater detail.

An excellent use case of data mining is datapine intelligent data alerts . With the help of artificial intelligence and machine learning, they provide automated signals based on particular commands or occurrences within a dataset. For example, if you’re monitoring supply chain KPIs , you could set an intelligent alarm to trigger when invalid or low-quality data appears. By doing so, you will be able to drill down deep into the issue and fix it swiftly and effectively.

In the following picture, you can see how the intelligent alarms from datapine work. By setting up ranges on daily orders, sessions, and revenues, the alarms will notify you if the goal was not completed or if it exceeded expectations.

Example on how to use intelligent alerts from datapine

7. Time series analysis

As its name suggests, time series analysis is used to analyze a set of data points collected over a specified period of time. Although analysts use this method to monitor the data points in a specific interval of time rather than just monitoring them intermittently, the time series analysis is not uniquely used for the purpose of collecting data over time. Instead, it allows researchers to understand if variables changed during the duration of the study, how the different variables are dependent, and how did it reach the end result. 

In a business context, this method is used to understand the causes of different trends and patterns to extract valuable insights. Another way of using this method is with the help of time series forecasting. Powered by predictive technologies, businesses can analyze various data sets over a period of time and forecast different future events. 

A great use case to put time series analysis into perspective is seasonality effects on sales. By using time series forecasting to analyze sales data of a specific product over time, you can understand if sales rise over a specific period of time (e.g. swimwear during summertime, or candy during Halloween). These insights allow you to predict demand and prepare production accordingly.  

8. Decision Trees 

The decision tree analysis aims to act as a support tool to make smart and strategic decisions. By visually displaying potential outcomes, consequences, and costs in a tree-like model, researchers and company users can easily evaluate all factors involved and choose the best course of action. Decision trees are helpful to analyze quantitative data and they allow for an improved decision-making process by helping you spot improvement opportunities, reduce costs, and enhance operational efficiency and production.

But how does a decision tree actually works? This method works like a flowchart that starts with the main decision that you need to make and branches out based on the different outcomes and consequences of each decision. Each outcome will outline its own consequences, costs, and gains and, at the end of the analysis, you can compare each of them and make the smartest decision. 

Businesses can use them to understand which project is more cost-effective and will bring more earnings in the long run. For example, imagine you need to decide if you want to update your software app or build a new app entirely.  Here you would compare the total costs, the time needed to be invested, potential revenue, and any other factor that might affect your decision.  In the end, you would be able to see which of these two options is more realistic and attainable for your company or research.

9. Conjoint analysis 

Last but not least, we have the conjoint analysis. This approach is usually used in surveys to understand how individuals value different attributes of a product or service and it is one of the most effective methods to extract consumer preferences. When it comes to purchasing, some clients might be more price-focused, others more features-focused, and others might have a sustainable focus. Whatever your customer's preferences are, you can find them with conjoint analysis. Through this, companies can define pricing strategies, packaging options, subscription packages, and more. 

A great example of conjoint analysis is in marketing and sales. For instance, a cupcake brand might use conjoint analysis and find that its clients prefer gluten-free options and cupcakes with healthier toppings over super sugary ones. Thus, the cupcake brand can turn these insights into advertisements and promotions to increase sales of this particular type of product. And not just that, conjoint analysis can also help businesses segment their customers based on their interests. This allows them to send different messaging that will bring value to each of the segments. 

10. Correspondence Analysis

Also known as reciprocal averaging, correspondence analysis is a method used to analyze the relationship between categorical variables presented within a contingency table. A contingency table is a table that displays two (simple correspondence analysis) or more (multiple correspondence analysis) categorical variables across rows and columns that show the distribution of the data, which is usually answers to a survey or questionnaire on a specific topic. 

This method starts by calculating an “expected value” which is done by multiplying row and column averages and dividing it by the overall original value of the specific table cell. The “expected value” is then subtracted from the original value resulting in a “residual number” which is what allows you to extract conclusions about relationships and distribution. The results of this analysis are later displayed using a map that represents the relationship between the different values. The closest two values are in the map, the bigger the relationship. Let’s put it into perspective with an example. 

Imagine you are carrying out a market research analysis about outdoor clothing brands and how they are perceived by the public. For this analysis, you ask a group of people to match each brand with a certain attribute which can be durability, innovation, quality materials, etc. When calculating the residual numbers, you can see that brand A has a positive residual for innovation but a negative one for durability. This means that brand A is not positioned as a durable brand in the market, something that competitors could take advantage of. 

11. Multidimensional Scaling (MDS)

MDS is a method used to observe the similarities or disparities between objects which can be colors, brands, people, geographical coordinates, and more. The objects are plotted using an “MDS map” that positions similar objects together and disparate ones far apart. The (dis) similarities between objects are represented using one or more dimensions that can be observed using a numerical scale. For example, if you want to know how people feel about the COVID-19 vaccine, you can use 1 for “don’t believe in the vaccine at all”  and 10 for “firmly believe in the vaccine” and a scale of 2 to 9 for in between responses.  When analyzing an MDS map the only thing that matters is the distance between the objects, the orientation of the dimensions is arbitrary and has no meaning at all. 

Multidimensional scaling is a valuable technique for market research, especially when it comes to evaluating product or brand positioning. For instance, if a cupcake brand wants to know how they are positioned compared to competitors, it can define 2-3 dimensions such as taste, ingredients, shopping experience, or more, and do a multidimensional scaling analysis to find improvement opportunities as well as areas in which competitors are currently leading. 

Another business example is in procurement when deciding on different suppliers. Decision makers can generate an MDS map to see how the different prices, delivery times, technical services, and more of the different suppliers differ and pick the one that suits their needs the best. 

A final example proposed by a research paper on "An Improved Study of Multilevel Semantic Network Visualization for Analyzing Sentiment Word of Movie Review Data". Researchers picked a two-dimensional MDS map to display the distances and relationships between different sentiments in movie reviews. They used 36 sentiment words and distributed them based on their emotional distance as we can see in the image below where the words "outraged" and "sweet" are on opposite sides of the map, marking the distance between the two emotions very clearly.

Example of multidimensional scaling analysis

Aside from being a valuable technique to analyze dissimilarities, MDS also serves as a dimension-reduction technique for large dimensional data. 

B. Qualitative Methods

Qualitative data analysis methods are defined as the observation of non-numerical data that is gathered and produced using methods of observation such as interviews, focus groups, questionnaires, and more. As opposed to quantitative methods, qualitative data is more subjective and highly valuable in analyzing customer retention and product development.

12. Text analysis

Text analysis, also known in the industry as text mining, works by taking large sets of textual data and arranging them in a way that makes it easier to manage. By working through this cleansing process in stringent detail, you will be able to extract the data that is truly relevant to your organization and use it to develop actionable insights that will propel you forward.

Modern software accelerate the application of text analytics. Thanks to the combination of machine learning and intelligent algorithms, you can perform advanced analytical processes such as sentiment analysis. This technique allows you to understand the intentions and emotions of a text, for example, if it's positive, negative, or neutral, and then give it a score depending on certain factors and categories that are relevant to your brand. Sentiment analysis is often used to monitor brand and product reputation and to understand how successful your customer experience is. To learn more about the topic check out this insightful article .

By analyzing data from various word-based sources, including product reviews, articles, social media communications, and survey responses, you will gain invaluable insights into your audience, as well as their needs, preferences, and pain points. This will allow you to create campaigns, services, and communications that meet your prospects’ needs on a personal level, growing your audience while boosting customer retention. There are various other “sub-methods” that are an extension of text analysis. Each of them serves a more specific purpose and we will look at them in detail next. 

13. Content Analysis

This is a straightforward and very popular method that examines the presence and frequency of certain words, concepts, and subjects in different content formats such as text, image, audio, or video. For example, the number of times the name of a celebrity is mentioned on social media or online tabloids. It does this by coding text data that is later categorized and tabulated in a way that can provide valuable insights, making it the perfect mix of quantitative and qualitative analysis.

There are two types of content analysis. The first one is the conceptual analysis which focuses on explicit data, for instance, the number of times a concept or word is mentioned in a piece of content. The second one is relational analysis, which focuses on the relationship between different concepts or words and how they are connected within a specific context. 

Content analysis is often used by marketers to measure brand reputation and customer behavior. For example, by analyzing customer reviews. It can also be used to analyze customer interviews and find directions for new product development. It is also important to note, that in order to extract the maximum potential out of this analysis method, it is necessary to have a clearly defined research question. 

14. Thematic Analysis

Very similar to content analysis, thematic analysis also helps in identifying and interpreting patterns in qualitative data with the main difference being that the first one can also be applied to quantitative analysis. The thematic method analyzes large pieces of text data such as focus group transcripts or interviews and groups them into themes or categories that come up frequently within the text. It is a great method when trying to figure out peoples view’s and opinions about a certain topic. For example, if you are a brand that cares about sustainability, you can do a survey of your customers to analyze their views and opinions about sustainability and how they apply it to their lives. You can also analyze customer service calls transcripts to find common issues and improve your service. 

Thematic analysis is a very subjective technique that relies on the researcher’s judgment. Therefore,  to avoid biases, it has 6 steps that include familiarization, coding, generating themes, reviewing themes, defining and naming themes, and writing up. It is also important to note that, because it is a flexible approach, the data can be interpreted in multiple ways and it can be hard to select what data is more important to emphasize. 

15. Narrative Analysis 

A bit more complex in nature than the two previous ones, narrative analysis is used to explore the meaning behind the stories that people tell and most importantly, how they tell them. By looking into the words that people use to describe a situation you can extract valuable conclusions about their perspective on a specific topic. Common sources for narrative data include autobiographies, family stories, opinion pieces, and testimonials, among others. 

From a business perspective, narrative analysis can be useful to analyze customer behaviors and feelings towards a specific product, service, feature, or others. It provides unique and deep insights that can be extremely valuable. However, it has some drawbacks.  

The biggest weakness of this method is that the sample sizes are usually very small due to the complexity and time-consuming nature of the collection of narrative data. Plus, the way a subject tells a story will be significantly influenced by his or her specific experiences, making it very hard to replicate in a subsequent study. 

16. Discourse Analysis

Discourse analysis is used to understand the meaning behind any type of written, verbal, or symbolic discourse based on its political, social, or cultural context. It mixes the analysis of languages and situations together. This means that the way the content is constructed and the meaning behind it is significantly influenced by the culture and society it takes place in. For example, if you are analyzing political speeches you need to consider different context elements such as the politician's background, the current political context of the country, the audience to which the speech is directed, and so on. 

From a business point of view, discourse analysis is a great market research tool. It allows marketers to understand how the norms and ideas of the specific market work and how their customers relate to those ideas. It can be very useful to build a brand mission or develop a unique tone of voice. 

17. Grounded Theory Analysis

Traditionally, researchers decide on a method and hypothesis and start to collect the data to prove that hypothesis. The grounded theory is the only method that doesn’t require an initial research question or hypothesis as its value lies in the generation of new theories. With the grounded theory method, you can go into the analysis process with an open mind and explore the data to generate new theories through tests and revisions. In fact, it is not necessary to collect the data and then start to analyze it. Researchers usually start to find valuable insights as they are gathering the data. 

All of these elements make grounded theory a very valuable method as theories are fully backed by data instead of initial assumptions. It is a great technique to analyze poorly researched topics or find the causes behind specific company outcomes. For example, product managers and marketers might use the grounded theory to find the causes of high levels of customer churn and look into customer surveys and reviews to develop new theories about the causes. 

How To Analyze Data? Top 17 Data Analysis Techniques To Apply

17 top data analysis techniques by datapine

Now that we’ve answered the questions “what is data analysis’”, why is it important, and covered the different data analysis types, it’s time to dig deeper into how to perform your analysis by working through these 17 essential techniques.

1. Collaborate your needs

Before you begin analyzing or drilling down into any techniques, it’s crucial to sit down collaboratively with all key stakeholders within your organization, decide on your primary campaign or strategic goals, and gain a fundamental understanding of the types of insights that will best benefit your progress or provide you with the level of vision you need to evolve your organization.

2. Establish your questions

Once you’ve outlined your core objectives, you should consider which questions will need answering to help you achieve your mission. This is one of the most important techniques as it will shape the very foundations of your success.

To help you ask the right things and ensure your data works for you, you have to ask the right data analysis questions .

3. Data democratization

After giving your data analytics methodology some real direction, and knowing which questions need answering to extract optimum value from the information available to your organization, you should continue with democratization.

Data democratization is an action that aims to connect data from various sources efficiently and quickly so that anyone in your organization can access it at any given moment. You can extract data in text, images, videos, numbers, or any other format. And then perform cross-database analysis to achieve more advanced insights to share with the rest of the company interactively.  

Once you have decided on your most valuable sources, you need to take all of this into a structured format to start collecting your insights. For this purpose, datapine offers an easy all-in-one data connectors feature to integrate all your internal and external sources and manage them at your will. Additionally, datapine’s end-to-end solution automatically updates your data, allowing you to save time and focus on performing the right analysis to grow your company.

data connectors from datapine

4. Think of governance 

When collecting data in a business or research context you always need to think about security and privacy. With data breaches becoming a topic of concern for businesses, the need to protect your client's or subject’s sensitive information becomes critical. 

To ensure that all this is taken care of, you need to think of a data governance strategy. According to Gartner , this concept refers to “ the specification of decision rights and an accountability framework to ensure the appropriate behavior in the valuation, creation, consumption, and control of data and analytics .” In simpler words, data governance is a collection of processes, roles, and policies, that ensure the efficient use of data while still achieving the main company goals. It ensures that clear roles are in place for who can access the information and how they can access it. In time, this not only ensures that sensitive information is protected but also allows for an efficient analysis as a whole. 

5. Clean your data

After harvesting from so many sources you will be left with a vast amount of information that can be overwhelming to deal with. At the same time, you can be faced with incorrect data that can be misleading to your analysis. The smartest thing you can do to avoid dealing with this in the future is to clean the data. This is fundamental before visualizing it, as it will ensure that the insights you extract from it are correct.

There are many things that you need to look for in the cleaning process. The most important one is to eliminate any duplicate observations; this usually appears when using multiple internal and external sources of information. You can also add any missing codes, fix empty fields, and eliminate incorrectly formatted data.

Another usual form of cleaning is done with text data. As we mentioned earlier, most companies today analyze customer reviews, social media comments, questionnaires, and several other text inputs. In order for algorithms to detect patterns, text data needs to be revised to avoid invalid characters or any syntax or spelling errors. 

Most importantly, the aim of cleaning is to prevent you from arriving at false conclusions that can damage your company in the long run. By using clean data, you will also help BI solutions to interact better with your information and create better reports for your organization.

6. Set your KPIs

Once you’ve set your sources, cleaned your data, and established clear-cut questions you want your insights to answer, you need to set a host of key performance indicators (KPIs) that will help you track, measure, and shape your progress in a number of key areas.

KPIs are critical to both qualitative and quantitative analysis research. This is one of the primary methods of data analysis you certainly shouldn’t overlook.

To help you set the best possible KPIs for your initiatives and activities, here is an example of a relevant logistics KPI : transportation-related costs. If you want to see more go explore our collection of key performance indicator examples .

Transportation costs logistics KPIs

7. Omit useless data

Having bestowed your data analysis tools and techniques with true purpose and defined your mission, you should explore the raw data you’ve collected from all sources and use your KPIs as a reference for chopping out any information you deem to be useless.

Trimming the informational fat is one of the most crucial methods of analysis as it will allow you to focus your analytical efforts and squeeze every drop of value from the remaining ‘lean’ information.

Any stats, facts, figures, or metrics that don’t align with your business goals or fit with your KPI management strategies should be eliminated from the equation.

8. Build a data management roadmap

While, at this point, this particular step is optional (you will have already gained a wealth of insight and formed a fairly sound strategy by now), creating a data governance roadmap will help your data analysis methods and techniques become successful on a more sustainable basis. These roadmaps, if developed properly, are also built so they can be tweaked and scaled over time.

Invest ample time in developing a roadmap that will help you store, manage, and handle your data internally, and you will make your analysis techniques all the more fluid and functional – one of the most powerful types of data analysis methods available today.

9. Integrate technology

There are many ways to analyze data, but one of the most vital aspects of analytical success in a business context is integrating the right decision support software and technology.

Robust analysis platforms will not only allow you to pull critical data from your most valuable sources while working with dynamic KPIs that will offer you actionable insights; it will also present them in a digestible, visual, interactive format from one central, live dashboard . A data methodology you can count on.

By integrating the right technology within your data analysis methodology, you’ll avoid fragmenting your insights, saving you time and effort while allowing you to enjoy the maximum value from your business’s most valuable insights.

For a look at the power of software for the purpose of analysis and to enhance your methods of analyzing, glance over our selection of dashboard examples .

10. Answer your questions

By considering each of the above efforts, working with the right technology, and fostering a cohesive internal culture where everyone buys into the different ways to analyze data as well as the power of digital intelligence, you will swiftly start to answer your most burning business questions. Arguably, the best way to make your data concepts accessible across the organization is through data visualization.

11. Visualize your data

Online data visualization is a powerful tool as it lets you tell a story with your metrics, allowing users across the organization to extract meaningful insights that aid business evolution – and it covers all the different ways to analyze data.

The purpose of analyzing is to make your entire organization more informed and intelligent, and with the right platform or dashboard, this is simpler than you think, as demonstrated by our marketing dashboard .

An executive dashboard example showcasing high-level marketing KPIs such as cost per lead, MQL, SQL, and cost per customer.

This visual, dynamic, and interactive online dashboard is a data analysis example designed to give Chief Marketing Officers (CMO) an overview of relevant metrics to help them understand if they achieved their monthly goals.

In detail, this example generated with a modern dashboard creator displays interactive charts for monthly revenues, costs, net income, and net income per customer; all of them are compared with the previous month so that you can understand how the data fluctuated. In addition, it shows a detailed summary of the number of users, customers, SQLs, and MQLs per month to visualize the whole picture and extract relevant insights or trends for your marketing reports .

The CMO dashboard is perfect for c-level management as it can help them monitor the strategic outcome of their marketing efforts and make data-driven decisions that can benefit the company exponentially.

12. Be careful with the interpretation

We already dedicated an entire post to data interpretation as it is a fundamental part of the process of data analysis. It gives meaning to the analytical information and aims to drive a concise conclusion from the analysis results. Since most of the time companies are dealing with data from many different sources, the interpretation stage needs to be done carefully and properly in order to avoid misinterpretations. 

To help you through the process, here we list three common practices that you need to avoid at all costs when looking at your data:

  • Correlation vs. causation: The human brain is formatted to find patterns. This behavior leads to one of the most common mistakes when performing interpretation: confusing correlation with causation. Although these two aspects can exist simultaneously, it is not correct to assume that because two things happened together, one provoked the other. A piece of advice to avoid falling into this mistake is never to trust just intuition, trust the data. If there is no objective evidence of causation, then always stick to correlation. 
  • Confirmation bias: This phenomenon describes the tendency to select and interpret only the data necessary to prove one hypothesis, often ignoring the elements that might disprove it. Even if it's not done on purpose, confirmation bias can represent a real problem, as excluding relevant information can lead to false conclusions and, therefore, bad business decisions. To avoid it, always try to disprove your hypothesis instead of proving it, share your analysis with other team members, and avoid drawing any conclusions before the entire analytical project is finalized.
  • Statistical significance: To put it in short words, statistical significance helps analysts understand if a result is actually accurate or if it happened because of a sampling error or pure chance. The level of statistical significance needed might depend on the sample size and the industry being analyzed. In any case, ignoring the significance of a result when it might influence decision-making can be a huge mistake.

13. Build a narrative

Now, we’re going to look at how you can bring all of these elements together in a way that will benefit your business - starting with a little something called data storytelling.

The human brain responds incredibly well to strong stories or narratives. Once you’ve cleansed, shaped, and visualized your most invaluable data using various BI dashboard tools , you should strive to tell a story - one with a clear-cut beginning, middle, and end.

By doing so, you will make your analytical efforts more accessible, digestible, and universal, empowering more people within your organization to use your discoveries to their actionable advantage.

14. Consider autonomous technology

Autonomous technologies, such as artificial intelligence (AI) and machine learning (ML), play a significant role in the advancement of understanding how to analyze data more effectively.

Gartner predicts that by the end of this year, 80% of emerging technologies will be developed with AI foundations. This is a testament to the ever-growing power and value of autonomous technologies.

At the moment, these technologies are revolutionizing the analysis industry. Some examples that we mentioned earlier are neural networks, intelligent alarms, and sentiment analysis.

15. Share the load

If you work with the right tools and dashboards, you will be able to present your metrics in a digestible, value-driven format, allowing almost everyone in the organization to connect with and use relevant data to their advantage.

Modern dashboards consolidate data from various sources, providing access to a wealth of insights in one centralized location, no matter if you need to monitor recruitment metrics or generate reports that need to be sent across numerous departments. Moreover, these cutting-edge tools offer access to dashboards from a multitude of devices, meaning that everyone within the business can connect with practical insights remotely - and share the load.

Once everyone is able to work with a data-driven mindset, you will catalyze the success of your business in ways you never thought possible. And when it comes to knowing how to analyze data, this kind of collaborative approach is essential.

16. Data analysis tools

In order to perform high-quality analysis of data, it is fundamental to use tools and software that will ensure the best results. Here we leave you a small summary of four fundamental categories of data analysis tools for your organization.

  • Business Intelligence: BI tools allow you to process significant amounts of data from several sources in any format. Through this, you can not only analyze and monitor your data to extract relevant insights but also create interactive reports and dashboards to visualize your KPIs and use them for your company's good. datapine is an amazing online BI software that is focused on delivering powerful online analysis features that are accessible to beginner and advanced users. Like this, it offers a full-service solution that includes cutting-edge analysis of data, KPIs visualization, live dashboards, reporting, and artificial intelligence technologies to predict trends and minimize risk.
  • Statistical analysis: These tools are usually designed for scientists, statisticians, market researchers, and mathematicians, as they allow them to perform complex statistical analyses with methods like regression analysis, predictive analysis, and statistical modeling. A good tool to perform this type of analysis is R-Studio as it offers a powerful data modeling and hypothesis testing feature that can cover both academic and general data analysis. This tool is one of the favorite ones in the industry, due to its capability for data cleaning, data reduction, and performing advanced analysis with several statistical methods. Another relevant tool to mention is SPSS from IBM. The software offers advanced statistical analysis for users of all skill levels. Thanks to a vast library of machine learning algorithms, text analysis, and a hypothesis testing approach it can help your company find relevant insights to drive better decisions. SPSS also works as a cloud service that enables you to run it anywhere.
  • SQL Consoles: SQL is a programming language often used to handle structured data in relational databases. Tools like these are popular among data scientists as they are extremely effective in unlocking these databases' value. Undoubtedly, one of the most used SQL software in the market is MySQL Workbench . This tool offers several features such as a visual tool for database modeling and monitoring, complete SQL optimization, administration tools, and visual performance dashboards to keep track of KPIs.
  • Data Visualization: These tools are used to represent your data through charts, graphs, and maps that allow you to find patterns and trends in the data. datapine's already mentioned BI platform also offers a wealth of powerful online data visualization tools with several benefits. Some of them include: delivering compelling data-driven presentations to share with your entire company, the ability to see your data online with any device wherever you are, an interactive dashboard design feature that enables you to showcase your results in an interactive and understandable way, and to perform online self-service reports that can be used simultaneously with several other people to enhance team productivity.

17. Refine your process constantly 

Last is a step that might seem obvious to some people, but it can be easily ignored if you think you are done. Once you have extracted the needed results, you should always take a retrospective look at your project and think about what you can improve. As you saw throughout this long list of techniques, data analysis is a complex process that requires constant refinement. For this reason, you should always go one step further and keep improving. 

Quality Criteria For Data Analysis

So far we’ve covered a list of methods and techniques that should help you perform efficient data analysis. But how do you measure the quality and validity of your results? This is done with the help of some science quality criteria. Here we will go into a more theoretical area that is critical to understanding the fundamentals of statistical analysis in science. However, you should also be aware of these steps in a business context, as they will allow you to assess the quality of your results in the correct way. Let’s dig in. 

  • Internal validity: The results of a survey are internally valid if they measure what they are supposed to measure and thus provide credible results. In other words , internal validity measures the trustworthiness of the results and how they can be affected by factors such as the research design, operational definitions, how the variables are measured, and more. For instance, imagine you are doing an interview to ask people if they brush their teeth two times a day. While most of them will answer yes, you can still notice that their answers correspond to what is socially acceptable, which is to brush your teeth at least twice a day. In this case, you can’t be 100% sure if respondents actually brush their teeth twice a day or if they just say that they do, therefore, the internal validity of this interview is very low. 
  • External validity: Essentially, external validity refers to the extent to which the results of your research can be applied to a broader context. It basically aims to prove that the findings of a study can be applied in the real world. If the research can be applied to other settings, individuals, and times, then the external validity is high. 
  • Reliability : If your research is reliable, it means that it can be reproduced. If your measurement were repeated under the same conditions, it would produce similar results. This means that your measuring instrument consistently produces reliable results. For example, imagine a doctor building a symptoms questionnaire to detect a specific disease in a patient. Then, various other doctors use this questionnaire but end up diagnosing the same patient with a different condition. This means the questionnaire is not reliable in detecting the initial disease. Another important note here is that in order for your research to be reliable, it also needs to be objective. If the results of a study are the same, independent of who assesses them or interprets them, the study can be considered reliable. Let’s see the objectivity criteria in more detail now. 
  • Objectivity: In data science, objectivity means that the researcher needs to stay fully objective when it comes to its analysis. The results of a study need to be affected by objective criteria and not by the beliefs, personality, or values of the researcher. Objectivity needs to be ensured when you are gathering the data, for example, when interviewing individuals, the questions need to be asked in a way that doesn't influence the results. Paired with this, objectivity also needs to be thought of when interpreting the data. If different researchers reach the same conclusions, then the study is objective. For this last point, you can set predefined criteria to interpret the results to ensure all researchers follow the same steps. 

The discussed quality criteria cover mostly potential influences in a quantitative context. Analysis in qualitative research has by default additional subjective influences that must be controlled in a different way. Therefore, there are other quality criteria for this kind of research such as credibility, transferability, dependability, and confirmability. You can see each of them more in detail on this resource . 

Data Analysis Limitations & Barriers

Analyzing data is not an easy task. As you’ve seen throughout this post, there are many steps and techniques that you need to apply in order to extract useful information from your research. While a well-performed analysis can bring various benefits to your organization it doesn't come without limitations. In this section, we will discuss some of the main barriers you might encounter when conducting an analysis. Let’s see them more in detail. 

  • Lack of clear goals: No matter how good your data or analysis might be if you don’t have clear goals or a hypothesis the process might be worthless. While we mentioned some methods that don’t require a predefined hypothesis, it is always better to enter the analytical process with some clear guidelines of what you are expecting to get out of it, especially in a business context in which data is utilized to support important strategic decisions. 
  • Objectivity: Arguably one of the biggest barriers when it comes to data analysis in research is to stay objective. When trying to prove a hypothesis, researchers might find themselves, intentionally or unintentionally, directing the results toward an outcome that they want. To avoid this, always question your assumptions and avoid confusing facts with opinions. You can also show your findings to a research partner or external person to confirm that your results are objective. 
  • Data representation: A fundamental part of the analytical procedure is the way you represent your data. You can use various graphs and charts to represent your findings, but not all of them will work for all purposes. Choosing the wrong visual can not only damage your analysis but can mislead your audience, therefore, it is important to understand when to use each type of data depending on your analytical goals. Our complete guide on the types of graphs and charts lists 20 different visuals with examples of when to use them. 
  • Flawed correlation : Misleading statistics can significantly damage your research. We’ve already pointed out a few interpretation issues previously in the post, but it is an important barrier that we can't avoid addressing here as well. Flawed correlations occur when two variables appear related to each other but they are not. Confusing correlations with causation can lead to a wrong interpretation of results which can lead to building wrong strategies and loss of resources, therefore, it is very important to identify the different interpretation mistakes and avoid them. 
  • Sample size: A very common barrier to a reliable and efficient analysis process is the sample size. In order for the results to be trustworthy, the sample size should be representative of what you are analyzing. For example, imagine you have a company of 1000 employees and you ask the question “do you like working here?” to 50 employees of which 49 say yes, which means 95%. Now, imagine you ask the same question to the 1000 employees and 950 say yes, which also means 95%. Saying that 95% of employees like working in the company when the sample size was only 50 is not a representative or trustworthy conclusion. The significance of the results is way more accurate when surveying a bigger sample size.   
  • Privacy concerns: In some cases, data collection can be subjected to privacy regulations. Businesses gather all kinds of information from their customers from purchasing behaviors to addresses and phone numbers. If this falls into the wrong hands due to a breach, it can affect the security and confidentiality of your clients. To avoid this issue, you need to collect only the data that is needed for your research and, if you are using sensitive facts, make it anonymous so customers are protected. The misuse of customer data can severely damage a business's reputation, so it is important to keep an eye on privacy. 
  • Lack of communication between teams : When it comes to performing data analysis on a business level, it is very likely that each department and team will have different goals and strategies. However, they are all working for the same common goal of helping the business run smoothly and keep growing. When teams are not connected and communicating with each other, it can directly affect the way general strategies are built. To avoid these issues, tools such as data dashboards enable teams to stay connected through data in a visually appealing way. 
  • Innumeracy : Businesses are working with data more and more every day. While there are many BI tools available to perform effective analysis, data literacy is still a constant barrier. Not all employees know how to apply analysis techniques or extract insights from them. To prevent this from happening, you can implement different training opportunities that will prepare every relevant user to deal with data. 

Key Data Analysis Skills

As you've learned throughout this lengthy guide, analyzing data is a complex task that requires a lot of knowledge and skills. That said, thanks to the rise of self-service tools the process is way more accessible and agile than it once was. Regardless, there are still some key skills that are valuable to have when working with data, we list the most important ones below.

  • Critical and statistical thinking: To successfully analyze data you need to be creative and think out of the box. Yes, that might sound like a weird statement considering that data is often tight to facts. However, a great level of critical thinking is required to uncover connections, come up with a valuable hypothesis, and extract conclusions that go a step further from the surface. This, of course, needs to be complemented by statistical thinking and an understanding of numbers. 
  • Data cleaning: Anyone who has ever worked with data before will tell you that the cleaning and preparation process accounts for 80% of a data analyst's work, therefore, the skill is fundamental. But not just that, not cleaning the data adequately can also significantly damage the analysis which can lead to poor decision-making in a business scenario. While there are multiple tools that automate the cleaning process and eliminate the possibility of human error, it is still a valuable skill to dominate. 
  • Data visualization: Visuals make the information easier to understand and analyze, not only for professional users but especially for non-technical ones. Having the necessary skills to not only choose the right chart type but know when to apply it correctly is key. This also means being able to design visually compelling charts that make the data exploration process more efficient. 
  • SQL: The Structured Query Language or SQL is a programming language used to communicate with databases. It is fundamental knowledge as it enables you to update, manipulate, and organize data from relational databases which are the most common databases used by companies. It is fairly easy to learn and one of the most valuable skills when it comes to data analysis. 
  • Communication skills: This is a skill that is especially valuable in a business environment. Being able to clearly communicate analytical outcomes to colleagues is incredibly important, especially when the information you are trying to convey is complex for non-technical people. This applies to in-person communication as well as written format, for example, when generating a dashboard or report. While this might be considered a “soft” skill compared to the other ones we mentioned, it should not be ignored as you most likely will need to share analytical findings with others no matter the context. 

Data Analysis In The Big Data Environment

Big data is invaluable to today’s businesses, and by using different methods for data analysis, it’s possible to view your data in a way that can help you turn insight into positive action.

To inspire your efforts and put the importance of big data into context, here are some insights that you should know:

  • By 2026 the industry of big data is expected to be worth approximately $273.4 billion.
  • 94% of enterprises say that analyzing data is important for their growth and digital transformation. 
  • Companies that exploit the full potential of their data can increase their operating margins by 60% .
  • We already told you the benefits of Artificial Intelligence through this article. This industry's financial impact is expected to grow up to $40 billion by 2025.

Data analysis concepts may come in many forms, but fundamentally, any solid methodology will help to make your business more streamlined, cohesive, insightful, and successful than ever before.

Key Takeaways From Data Analysis 

As we reach the end of our data analysis journey, we leave a small summary of the main methods and techniques to perform excellent analysis and grow your business.

17 Essential Types of Data Analysis Methods:

  • Cluster analysis
  • Cohort analysis
  • Regression analysis
  • Factor analysis
  • Neural Networks
  • Data Mining
  • Text analysis
  • Time series analysis
  • Decision trees
  • Conjoint analysis 
  • Correspondence Analysis
  • Multidimensional Scaling 
  • Content analysis 
  • Thematic analysis
  • Narrative analysis 
  • Grounded theory analysis
  • Discourse analysis 

Top 17 Data Analysis Techniques:

  • Collaborate your needs
  • Establish your questions
  • Data democratization
  • Think of data governance 
  • Clean your data
  • Set your KPIs
  • Omit useless data
  • Build a data management roadmap
  • Integrate technology
  • Answer your questions
  • Visualize your data
  • Interpretation of data
  • Consider autonomous technology
  • Build a narrative
  • Share the load
  • Data Analysis tools
  • Refine your process constantly 

We’ve pondered the data analysis definition and drilled down into the practical applications of data-centric analytics, and one thing is clear: by taking measures to arrange your data and making your metrics work for you, it’s possible to transform raw information into action - the kind of that will push your business to the next level.

Yes, good data analytics techniques result in enhanced business intelligence (BI). To help you understand this notion in more detail, read our exploration of business intelligence reporting .

And, if you’re ready to perform your own analysis, drill down into your facts and figures while interacting with your data on astonishing visuals, you can try our software for a free, 14-day trial .

what is data analysis procedures for qualitative research

No products in the cart.

What is Qualitative Data Analysis Software (QDA Software)?

what is data analysis procedures for qualitative research

Qualitative Data Analysis Software (QDA software) allows researchers to organize, analyze and visualize their data, finding the patterns in qualitative data or unstructured data: interviews, surveys, field notes, videos, audio files, images, journal articles interviews, web content etc.

Quantitative vs. Qualitative Data Analysis

What is the difference between quantitative and qualitative data analysis. As the name implies, quantitative data analysis has to do with numbers. For example, any time you are doing statistical analysis, you are doing quantitative data analysis. Some examples of quantitative data analysis software are SPSS, STATA, SAS, and Lumivero’s own powerful statistics software, XLSTAT .

In contrast, qualitative analysis "helps you understand people’s perceptions and experiences by systematically coding and analyzing the data", as described in Qualitative vs Quantitative Research 101 . It tends to deal more with words than numbers. It can be useful when working with a lot of rich and deep data and when you aren’t trying to test something very specific. Some examples of qualitative data analysis software are MAXQDA, ATLAS.ti, Quirkos, and Lumivero’s NVivo, the leading tool for qualitative data analysis .

When would you use each one? Well, qualitative data analysis is often used for exploratory research or developing a theory, whereas quantitative is better if you want to test a hypothesis, find averages, and determine relationships between variables. With quantitative research you often want a large sample size to get relevant statistics. In contrast, qualitative research, because so much data in the form of text is involved, can have much smaller sample sizes and still yield valuable insights.

Of course, it’s not always so cut and dry, and many researchers end up taking a «mixed methods» approach, meaning that they combine both types of research. In this case they might use a combination of both types of software programs.

Learn how some qualitative researchers use QDA software for text analysis in the on-demand webinar Twenty-Five Qualitative Researchers Share How-To's for Data Analysis .

NVivo Demo Request

How is Qualitative Data Analysis Software Used for Research?

Qualitative Data Analysis Software works with any qualitative research methodology used by a researcher For example, software for qualitative data analysis can be used by a social scientist wanting to develop new concepts or theories may take a ‘grounded theory’ approach. Or a researcher looking for ways to improve health policy or program design might use ‘evaluation methods’. QDA software analysis tools don't favor a particular methodology — they're designed to facilitate common qualitative techniques no matter what method you use.

NVivo can help you to manage, explore and find patterns in your data and conduct thematic and sentiment analysis, but it cannot replace your analytical expertise.

Qualitative Research as an Iterative Process

Handling qualitative and mixed methods data is not usually a step-by-step process. Instead, it tends to be an iterative process where you explore, code, reflect, memo, code some more, query and so on. For example, this picture shows a path you might take to investigate an interesting theme using QDA software, like NVivo, to analyze data:

what is data analysis procedures for qualitative research

How Do I Choose the Best Approach for My Research Project with QDA Software?

Every research project is unique — the way you organize and analyze the material depends on your methodology, data and research design.

Here are some example scenarios for handling different types of research projects in QDA software — these are just suggestions to get you up and running.

A study with interviews exploring stakeholder perception of a community arts program

Your files consist of unstructured interview documents. You would set up a case for each interview participant, then code to codes and cases. You could then explore your data with simple queries or charts and use memos to record your discoveries.

what is data analysis procedures for qualitative research

A study exploring community perceptions about climate change using autocoding with AI

Your files consist of structured, consistently formatted interviews (where each participant is asked the same set of questions). With AI, you could autocode the interviews and set up cases for each participant. Then code themes to query and visualize your data.

what is data analysis procedures for qualitative research

A literature review on adolescent depression

Your files consist of journal articles, books and web pages. You would classify your files before coding and querying them; and then you could critique each file in a memo. With Citavi integration in NVivo, you can import your Citavi references into NVivo.

what is data analysis procedures for qualitative research

A social media study of the language used by members of an online community

Your files consist of Facebook data captured with NCapture. You would import it as a dataset ready to code and query. Use memos to record your insights.

what is data analysis procedures for qualitative research

A quick analysis of a local government budget survey

Your file is a large dataset of survey responses. You would import it using the Survey Import Wizard, which prepares your data for analysis. As part of the import, choose to run automated insights with AI to identify and code to themes and sentiment so that you can quickly review results and report broad findings.

what is data analysis procedures for qualitative research

Ways to Get Started with Your Project with Qualitative Analysis Software

Since projects (and researchers) are unique there is no one 'best practice' approach to organizing and analyzing your data but there are some useful strategies to help you get up and running:

  • Start now - don't wait until you have collected all the data. Import your research design, grant application or thesis proposal.
  • Make a  project journa l and state your research questions and record your goals. Why are you doing the project? What is it about? What do you expect to find and why?
  • Make a  mind map  for your preliminary ideas. Show the relationships or patterns you expect to find in your data based on prior experience or preliminary reading.
  • Import your interviews, field notes, focus groups —organize these files into folders for easy access.
  • Set up an initial code structure based on your early reading and ideas—you could run a  Word Frequency query over your data to tease out the common themes for creating your code structure.
  • Set up  cases  for the people, places or other cases in your project.
  • Explore your material and  code themes as they emerge in your data mining —creating memos and describing your discoveries and interpretations.
  • To protect your work, get in the habit of making regular back-ups.

QDA Analysis Tools Help You Work Toward Outcomes that are Robust and Transparent

Using QDA software to organize and analyze your data also increases the 'transparency' of your research outcomes—for example, you can:

  • Demonstrate the evolution of your ideas in memos and maps.
  • Document your early preconceptions and biases (in a memo or map) and demonstrate how these have been acknowledged and tested.
  • Easily find illustrative quotes.
  • Always return to the original context of your coded material.
  • Save and revisit the queries and visualizations that helped you to arrive at your conclusions.

QDA software, like NVivo, can demonstrate the credibility of your findings in the following ways:

  • If you used NVivo for your literature review, run a  query  or create a  chart  to demonstrate how your findings compare with the views of other authors.
  • Was an issue or theme reported by more than one participant? Run a  Matrix Coding query  to see how many participants talked about a theme.
  • Were multiple methods used to collect the data (interviews, observations, surveys)—and are the findings supported across these text data and video data files? Run a Matrix Coding query to see how often a theme is reported across all your files.

what is data analysis procedures for qualitative research

  • If multiple researchers analyzed the material — were their findings consistent? Use coding stripes (or filter the contents in a code) to see how various team members have coded the material and run a Coding Comparison query to assess the level of agreement.

what is data analysis procedures for qualitative research

QDA Software Integrations

Many qualitative analysis software options have integration with other software to enhance your research process. NVivo integrates or can be used with the following software:

  • NVivo Transcription to save you time and jump start your qualitative data analysis. Learn how in the on-demand webinar Transcription – Go Beyond the Words .
  • Reference management software, like Lumivero’s Citavi, for reference management and writing. By combining Citavi and NVivo, you can create complicated searches for certain keywords, terms, and categories and make advanced search syntax, like wildcards, boolean operators, and regular expressions. This integration allows you to take your analyses beyond reference management by developing a central location to collect references and thoughts, analyze literature, and connect empirical data.
  • Statistical software, like Lumivero’s XLSTAT , SPSS, or STATA to export your queries from NVivo to run statistical analysis
  • Qualtrics, SurveyMonkey to import your survey results into NVivo to start analyzing.

Make Choosing QDA Software Easy —  Try NVivo Today!

It's tough choosing QDA software! Test out NVivo, the most cited qualitative data analysis tool, by requesting a free 14-day trial of NVivo to start improving your qualitative and mixed methods research today.

Recent Articles

  • Python For Data Analysis
  • Data Science
  • Data Analysis with R
  • Data Analysis with Python
  • Data Visualization with Python
  • Data Analysis Examples
  • Math for Data Analysis
  • Data Analysis Interview questions
  • Artificial Intelligence
  • Data Analysis Projects
  • Machine Learning
  • Deep Learning
  • Computer Vision
  • What is Data Analytics?
  • What is Statistical Analysis in Data Science?
  • What Is Spatial Analysis, and How Does It Work
  • What is Data Analysis?
  • What is Data Munging in Analysis?
  • What is Geospatial Data Analysis?
  • What is Exploratory Data Analysis ?
  • Qualitative and Quantitative Data
  • What are Descriptive Analytics?
  • What is Prescriptive Analytics in Data Science?
  • Qualitative Data
  • Data Analysis Tutorial
  • Why Data Visualization Matters in Data Analytics?
  • What is Data Lineage?
  • Data analysis using R
  • What is Data Organization?
  • What is a Data Science Platform?

What is Qualitative Data Analysis?

Understanding qualitative information analysis is important for researchers searching for to uncover nuanced insights from non-numerical statistics. By exploring qualitative statistics evaluation, you can still draw close its importance in studies, understand its methodologies, and determine while and the way to apply it successfully to extract meaningful insights from qualitative records.

The article goals to provide a complete manual to expertise qualitative records evaluation, masking its significance, methodologies, steps, advantages, disadvantages, and applications.

What-is-Qualitative-Data-Analysis

Table of Content

Understanding Qualitative Data Analysis

Importance of qualitative data analysis, steps to perform qualitative data analysis, 1. craft clear research questions, 2. gather rich customer insights, 3. organize and categorize data, 4. uncover themes and patterns : coding, 5. make hypotheses and validating, methodologies in qualitative data analysis, advantages of qualitative data analysis, disadvantages of qualitative data analysis, when qualitative data analysis is used, applications of qualitative data analysis.

Qualitative data analysis is the process of systematically examining and deciphering qualitative facts (such as textual content, pix, motion pictures, or observations) to discover patterns, themes, and meanings inside the statistics· Unlike quantitative statistics evaluation, which focuses on numerical measurements and statistical strategies, qualitative statistics analysis emphasizes know-how the context, nuances, and subjective views embedded inside the information.

Qualitative facts evaluation is crucial because it is going past the bloodless hard information and numbers to provide a richer expertise of why and the way things appear. Qualitative statistics analysis is important for numerous motives:

  • Understanding Complexity and unveils the “Why” : Quantitative facts tells you “what” came about (e· g·, sales figures), however qualitative evaluation sheds light on the motives in the back of it (e·g·, consumer comments on product features).
  • Contextual Insight : Numbers don’t exist in a vacuum. Qualitative information affords context to quantitative findings, making the bigger photo clearer· Imagine high customer churn – interviews would possibly monitor lacking functionalities or perplexing interfaces.
  • Uncovers Emotions and Opinions: Qualitative records faucets into the human element· Surveys with open ended questions or awareness companies can display emotions, critiques, and motivations that can’t be captured by using numbers on my own.
  • Informs Better Decisions: By understanding the “why” and the “how” at the back of customer behavior or employee sentiment, companies can make greater knowledgeable decisions about product improvement, advertising techniques, and internal techniques.
  • Generates New Ideas : Qualitative analysis can spark clean thoughts and hypotheses· For example, via analyzing consumer interviews, commonplace subject matters may emerge that cause totally new product features.
  • Complements Quantitative Data : While each facts sorts are precious, they paintings quality collectively· Imagine combining website site visitors records (quantitative) with person comments (qualitative) to apprehend user revel in on a particular webpage.

In essence, qualitative data evaluation bridges the gap among the what and the why, providing a nuanced know-how that empowers better choice making·

Steps-to-Perform-Qualitative-Data-Analysis

Qualitative data analysis process, follow the structure in below steps:

Qualitative information evaluation procedure, comply with the shape in underneath steps:

Before diving into evaluation, it is critical to outline clear and particular studies questions. These questions ought to articulate what you want to study from the data and manual your analysis towards actionable insights. For instance, asking “How do employees understand the organizational culture inside our agency?” helps makes a speciality of know-how personnel’ perceptions of the organizational subculture inside a selected business enterprise. By exploring employees’ perspectives, attitudes, and stories related to organizational tradition, researchers can find valuable insights into workplace dynamics, communication patterns, management patterns, and worker delight degrees.

There are numerous methods to acquire qualitative information, each offering specific insights into client perceptions and reviews.

  • User Feedback: In-app surveys, app rankings, and social media feedback provide direct remarks from users approximately their studies with the products or services.
  • In-Depth Interviews : One-on-one interviews allow for deeper exploration of particular topics and offer wealthy, special insights into individuals’ views and behaviors.
  • Focus Groups : Facilitating group discussions allows the exploration of numerous viewpoints and permits individuals to construct upon every different’s ideas.
  • Review Sites : Analyzing purchaser critiques on systems like Amazon, Yelp, or app shops can monitor not unusual pain points, pride levels, and areas for improvement.
  • NPS Follow-Up Questions : Following up on Net Promoter Score (NPS) surveys with open-ended questions allows customers to elaborate on their rankings and provides qualitative context to quantitative ratings.

Efficient facts below is crucial for powerful analysis and interpretation.

  • Centralize: Gather all qualitative statistics, along with recordings, notes, and transcripts, right into a valuable repository for smooth get admission to and control.
  • Categorize through Research Question : Group facts primarily based at the specific studies questions they deal with. This organizational structure allows maintain consciousness in the course of analysis and guarantees that insights are aligned with the research objectives.

Coding is a scientific manner of assigning labels or categories to segments of qualitative statistics to uncover underlying issues and patterns.

  • Theme Identification : Themes are overarching principles or ideas that emerge from the records· During coding, researchers perceive and label segments of statistics that relate to those themes, bearing in mind the identification of vital principles in the dataset.
  • Pattern Detection : Patterns seek advice from relationships or connections between exceptional elements in the statistics. By reading coded segments, researchers can locate trends, repetitions, or cause-and-effect relationships, imparting deeper insights into patron perceptions and behaviors.

Based on the identified topics and styles, researchers can formulate hypotheses and draw conclusions about patron experiences and choices.

  • Hypothesis Formulation: Hypotheses are tentative causes or predictions based on found styles within the information. Researchers formulate hypotheses to provide an explanation for why certain themes or styles emerge and make predictions approximately their effect on patron behavior.
  • Validation : Researchers validate hypotheses by means of segmenting the facts based on one-of-a-kind standards (e.g., demographic elements, usage patterns) and analyzing variations or relationships inside the records. This procedure enables enhance the validity of findings and offers proof to assist conclusions drawn from qualitative evaluation.

There are five common methodologies utilized in Qualitative Data Analysis·

  • Thematic Analysis : Thematic Analysis involves systematically figuring out and reading habitual subject matters or styles within qualitative statistics. Researchers begin with the aid of coding the facts, breaking it down into significant segments, and then categorizing these segments based on shared traits. Through iterative analysis, themes are advanced and refined, permitting researchers to benefit insight into the underlying phenomena being studied.
  • Content Analysis: Content Analysis focuses on reading textual information to pick out and quantify particular styles or issues. Researchers code the statistics primarily based on predefined classes or subject matters, taking into consideration systematic agency and interpretation of the content. By analyzing how frequently positive themes occur and the way they’re represented inside the data, researchers can draw conclusions and insights relevant to their research objectives.
  • Narrative Analysis: Narrative Analysis delves into the narrative or story within qualitative statistics, that specialize in its structure, content, and meaning. Researchers examine the narrative to understand its context and attitude, exploring how individuals assemble and speak their reports thru storytelling. By analyzing the nuances and intricacies of the narrative, researchers can find underlying issues and advantage a deeper know-how of the phenomena being studied.
  • Grounded Theory : Grounded Theory is an iterative technique to growing and checking out theoretical frameworks primarily based on empirical facts. Researchers gather, code, and examine information without preconceived hypotheses, permitting theories to emerge from the information itself. Through constant assessment and theoretical sampling, researchers validate and refine theories, main to a deeper knowledge of the phenomenon under investigation.
  • Phenomenological Analysis : Phenomenological Analysis objectives to discover and recognize the lived stories and views of people. Researchers analyze and interpret the meanings, essences, and systems of these reviews, figuring out not unusual topics and styles across individual debts. By immersing themselves in members’ subjective stories, researchers advantage perception into the underlying phenomena from the individuals’ perspectives, enriching our expertise of human behavior and phenomena.
  • Richness and Depth: Qualitative records evaluation lets in researchers to discover complex phenomena intensive, shooting the richness and complexity of human stories, behaviors, and social processes.
  • Flexibility : Qualitative techniques offer flexibility in statistics collection and evaluation, allowing researchers to conform their method based on emergent topics and evolving studies questions.
  • Contextual Understanding: Qualitative evaluation presents perception into the context and meaning of information, helping researchers recognize the social, cultural, and historic elements that form human conduct and interactions.
  • Subjective Perspectives : Qualitative methods allow researchers to explore subjective perspectives, beliefs, and reviews, offering a nuanced know-how of people’ mind, emotions, and motivations.
  • Theory Generation : Qualitative information analysis can cause the generation of recent theories or hypotheses, as researchers uncover patterns, themes, and relationships in the records that might not were formerly recognized.
  • Subjectivity: Qualitative records evaluation is inherently subjective, as interpretations can be stimulated with the aid of researchers’ biases, views, and preconceptions .
  • Time-Intensive : Qualitative records analysis may be time-consuming, requiring giant data collection, transcription, coding, and interpretation.
  • Generalizability: Findings from qualitative studies might not be effortlessly generalizable to larger populations, as the focus is often on know-how unique contexts and reviews in preference to making statistical inferences.
  • Validity and Reliability : Ensuring the validity and reliability of qualitative findings may be difficult, as there are fewer standardized methods for assessing and establishing rigor in comparison to quantitative studies.
  • Data Management : Managing and organizing qualitative information, together with transcripts, subject notes, and multimedia recordings, can be complicated and require careful documentation and garage.
  • Exploratory Research: Qualitative records evaluation is nicely-suited for exploratory studies, wherein the aim is to generate hypotheses, theories, or insights into complex phenomena.
  • Understanding Context : Qualitative techniques are precious for knowledge the context and which means of statistics, in particular in studies wherein social, cultural, or ancient factors are vital.
  • Subjective Experiences : Qualitative evaluation is good for exploring subjective stories, beliefs, and views, providing a deeper knowledge of people’ mind, feelings, and behaviors.
  • Complex Phenomena: Qualitative strategies are effective for studying complex phenomena that can not be effortlessly quantified or measured, allowing researchers to seize the richness and depth of human stories and interactions.
  • Complementary to Quantitative Data: Qualitative information analysis can complement quantitative research by means of offering context, intensity, and insight into the meanings at the back of numerical statistics, enriching our knowledge of studies findings.
  • Social Sciences: Qualitative information analysis is widely utilized in social sciences to apprehend human conduct, attitudes, and perceptions. Researchers employ qualitative methods to delve into the complexities of social interactions, cultural dynamics, and societal norms. By analyzing qualitative records which include interviews, observations, and textual resources, social scientists benefit insights into the elaborate nuances of human relationships, identity formation, and societal structures.
  • Psychology : In psychology, qualitative data evaluation is instrumental in exploring and deciphering person reports, emotions, and motivations. Qualitative methods along with in-depth interviews, cognizance businesses, and narrative evaluation allow psychologists to delve deep into the subjective stories of individuals. This approach facilitates discover underlying meanings, beliefs, and emotions, dropping light on psychological processes, coping mechanisms, and personal narratives.
  • Anthropology : Anthropologists use qualitative records evaluation to look at cultural practices, ideals, and social interactions inside various groups and societies. Through ethnographic research strategies such as player statement and interviews, anthropologists immerse themselves within the cultural contexts of different agencies. Qualitative analysis permits them to find the symbolic meanings, rituals, and social systems that form cultural identification and behavior.
  • Qualitative Market Research : In the sphere of marketplace research, qualitative statistics analysis is vital for exploring consumer options, perceptions, and behaviors. Qualitative techniques which include consciousness groups, in-depth interviews, and ethnographic research permit marketplace researchers to gain a deeper understanding of customer motivations, choice-making methods, and logo perceptions· By analyzing qualitative facts, entrepreneurs can identify emerging developments, discover unmet wishes, and tell product development and advertising and marketing techniques.
  • Healthcare: Qualitative statistics analysis plays a important function in healthcare studies via investigating patient experiences, delight, and healthcare practices. Researchers use qualitative techniques which includes interviews, observations, and patient narratives to explore the subjective reviews of people inside healthcare settings. Qualitative evaluation helps find affected person perspectives on healthcare services, treatment consequences, and pleasant of care, facilitating enhancements in patient-targeted care delivery and healthcare policy.

Qualitative data evaluation offers intensity, context, and know-how to investigate endeavors, enabling researchers to find wealthy insights and discover complicated phenomena via systematic examination of non-numerical information.

Please Login to comment...

Similar reads.

  • Data Analysis

Improve your Coding Skills with Practice

 alt=

What kind of Experience do you want to share?

  • Open access
  • Published: 13 May 2024

What are the strengths and limitations to utilising creative methods in public and patient involvement in health and social care research? A qualitative systematic review

  • Olivia R. Phillips 1 , 2   na1 ,
  • Cerian Harries 2 , 3   na1 ,
  • Jo Leonardi-Bee 1 , 2 , 4   na1 ,
  • Holly Knight 1 , 2 ,
  • Lauren B. Sherar 2 , 3 ,
  • Veronica Varela-Mato 2 , 3 &
  • Joanne R. Morling 1 , 2 , 5  

Research Involvement and Engagement volume  10 , Article number:  48 ( 2024 ) Cite this article

103 Accesses

2 Altmetric

Metrics details

There is increasing interest in using patient and public involvement (PPI) in research to improve the quality of healthcare. Ordinarily, traditional methods have been used such as interviews or focus groups. However, these methods tend to engage a similar demographic of people. Thus, creative methods are being developed to involve patients for whom traditional methods are inaccessible or non-engaging.

To determine the strengths and limitations to using creative PPI methods in health and social care research.

Electronic searches were conducted over five databases on 14th April 2023 (Web of Science, PubMed, ASSIA, CINAHL, Cochrane Library). Studies that involved traditional, non-creative PPI methods were excluded. Creative PPI methods were used to engage with people as research advisors, rather than study participants. Only primary data published in English from 2009 were accepted. Title, abstract and full text screening was undertaken by two independent reviewers before inductive thematic analysis was used to generate themes.

Twelve papers met the inclusion criteria. The creative methods used included songs, poems, drawings, photograph elicitation, drama performance, visualisations, social media, photography, prototype development, cultural animation, card sorting and persona development. Analysis identified four limitations and five strengths to the creative approaches. Limitations included the time and resource intensive nature of creative PPI, the lack of generalisation to wider populations and ethical issues. External factors, such as the lack of infrastructure to support creative PPI, also affected their implementation. Strengths included the disruption of power hierarchies and the creation of a safe space for people to express mundane or “taboo” topics. Creative methods are also engaging, inclusive of people who struggle to participate in traditional PPI and can also be cost and time efficient.

‘Creative PPI’ is an umbrella term encapsulating many different methods of engagement and there are strengths and limitations to each. The choice of which should be determined by the aims and requirements of the research, as well as the characteristics of the PPI group and practical limitations. Creative PPI can be advantageous over more traditional methods, however a hybrid approach could be considered to reap the benefits of both. Creative PPI methods are not widely used; however, this could change over time as PPI becomes embedded even more into research.

Plain English Summary

It is important that patients and public are included in the research process from initial brainstorming, through design to delivery. This is known as public and patient involvement (PPI). Their input means that research closely aligns with their wants and needs. Traditionally to get this input, interviews and group discussions are held, but this can exclude people who find these activities non-engaging or inaccessible, for example those with language challenges, learning disabilities or memory issues. Creative methods of PPI can overcome this. This is a broad term describing different (non-traditional) ways of engaging patients and public in research, such as through the use or art, animation or performance. This review investigated the reasons why creative approaches to PPI could be difficult (limitations) or helpful (strengths) in health and social care research. After searching 5 online databases, 12 studies were included in the review. PPI groups included adults, children and people with language and memory impairments. Creative methods included songs, poems, drawings, the use of photos and drama, visualisations, Facebook, creating prototypes, personas and card sorting. Limitations included the time, cost and effort associated with creative methods, the lack of application to other populations, ethical issues and buy-in from the wider research community. Strengths included the feeling of equality between academics and the public, creation of a safe space for people to express themselves, inclusivity, and that creative PPI can be cost and time efficient. Overall, this review suggests that creative PPI is worthwhile, however each method has its own strengths and limitations and the choice of which will depend on the research project, PPI group characteristics and other practical limitations, such as time and financial constraints.

Peer Review reports

Introduction

Patient and public involvement (PPI) is the term used to describe the partnership between patients (including caregivers, potential patients, healthcare users etc.) or the public (a community member with no known interest in the topic) with researchers. It describes research that is done “‘with’ or ‘by’ the public, rather than ‘to,’ ‘about’ or ‘for’ them” [ 1 ]. In 2009, it became a legislative requirement for certain health and social care organisations to include patients, families, carers and communities in not only the planning of health and social care services, but the commissioning, delivery and evaluation of them too [ 2 ]. For example, funding applications for the National Institute of Health and Care Research (NIHR), a UK funding body, mandates a demonstration of how researchers plan to include patients/service users, the public and carers at each stage of the project [ 3 ]. However, this should not simply be a tokenistic, tick-box exercise. PPI should help formulate initial ideas and should be an instrumental, continuous part of the research process. Input from PPI can provide unique insights not yet considered and can ensure that research and health services are closely aligned to the needs and requirements of service users PPI also generally makes research more relevant with clearer outcomes and impacts [ 4 ]. Although this review refers to both patients and the public using the umbrella term ‘PPI’, it is important to acknowledge that these are two different groups with different motivations, needs and interests when it comes to health research and service delivery [ 5 ].

Despite continuing recognition of the need of PPI to improve quality of healthcare, researchers have also recognised that there is no ‘one size fits all’ method for involving patients [ 4 ]. Traditionally, PPI methods invite people to take part in interviews or focus groups to facilitate discussion, or surveys and questionnaires. However, these can sometimes be inaccessible or non-engaging for certain populations. For example, someone with communication difficulties may find it difficult to engage in focus groups or interviews. If individuals lack the appropriate skills to interact in these types of scenarios, they cannot take advantage of the participation opportunities it can provide [ 6 ]. Creative methods, however, aim to resolve these issues. These are a relatively new concept whereby researchers use creative methods (e.g., artwork, animations, Lego), to make PPI more accessible and engaging for those whose voices would otherwise go unheard. They ensure that all populations can engage in research, regardless of their background or skills. Seminal work has previously been conducted in this area, which brought to light the use of creative methodologies in research. Leavy (2008) [ 7 ] discussed how traditional interviews had limits on what could be expressed due to their sterile, jargon-filled and formulaic structure, read by only a few specialised academics. It was this that called for more creative approaches, which included narrative enquiry, fiction-based research, poetry, music, dance, art, theatre, film and visual art. These practices, which can be used in any stage of the research cycle, supported greater empathy, self-reflection and longer-lasting learning experiences compared to interviews [ 7 ]. They also pushed traditional academic boundaries, which made the research accessible not only to researchers, but the public too. Leavy explains that there are similarities between arts-based approaches and scientific approaches: both attempts to investigate what it means to be human through exploration, and used together, these complimentary approaches can progress our understanding of the human experience [ 7 ]. Further, it is important to acknowledge the parallels and nuances between creative and inclusive methods of PPI. Although creative methods aim to be inclusive (this should underlie any PPI activity, whether creative or not), they do not incorporate all types of accessible, inclusive methodologies e.g., using sign language for people with hearing impairments or audio recordings for people who cannot read. Given that there was not enough scope to include an evaluation of all possible inclusive methodologies, this review will focus on creative methods of PPI only.

We aimed to conduct a qualitative systematic review to highlight the strengths of creative PPI in health and social care research, as well as the limitations, which might act as a barrier to their implementation. A qualitative systematic review “brings together research on a topic, systematically searching for research evidence from primary qualitative studies and drawing the findings together” [ 8 ]. This review can then advise researchers of the best practices when designing PPI.

Public involvement

The PHIRST-LIGHT Public Advisory Group (PAG) consists of a team of experienced public contributors with a diverse range of characteristics from across the UK. The PAG was involved in the initial question setting and study design for this review.

Search strategy

For the purpose of this review, the JBI approach for conducting qualitative systematic reviews was followed [ 9 ]. The search terms were (“creativ*” OR “innovat*” OR “authentic” OR “original” OR “inclu*”) AND (“public and patient involvement” OR “patient and public involvement” OR “public and patient involvement and engagement” OR “patient and public involvement and engagement” OR “PPI” OR “PPIE” OR “co-produc*” OR “co-creat*” OR “co-design*” OR “cooperat*” OR “co-operat*”). This search string was modified according to the requirements of each database. Papers were filtered by title, abstract and keywords (see Additional file 1 for search strings). The databases searched included Web of Science (WoS), PubMed, ASSIA and CINAHL. The Cochrane Library was also searched to identify relevant reviews which could lead to the identification of primary research. The search was conducted on 14/04/23. As our aim was to report on the use of creative PPI in research, rather than more generic public engagement, we used electronic databases of scholarly peer-reviewed literature, which represent a wide range of recognised databases. These identified studies published in general international journals (WoS, PubMed), those in social sciences journals (ASSIA), those in nursing and allied health journals (CINAHL), and trials of interventions (Cochrane Library).

Inclusion criteria

Only full-text, English language, primary research papers from 2009 to 2023 were included. This was the chosen timeframe as in 2009 the Health and Social Reform Act made it mandatory for certain Health and Social Care organisations to involve the public and patients in planning, delivering, and evaluating services [ 2 ]. Only creative methods of PPI were accepted, rather than traditional methods, such as interviews or focus groups. For the purposes of this paper, creative PPI included creative art or arts-based approaches (e.g., e.g. stories, songs, drama, drawing, painting, poetry, photography) to enhance engagement. Titles were related to health and social care and the creative PPI was used to engage with people as research advisors, not as study participants. Meta-analyses, conference abstracts, book chapters, commentaries and reviews were excluded. There were no limits concerning study location or the demographic characteristics of the PPI groups. Only qualitative data were accepted.

Quality appraisal

Quality appraisal using the Critical Appraisal Skills Programme (CASP) checklist [ 10 ] was conducted by the primary authors (ORP and CH). This was done independently, and discrepancies were discussed and resolved. If a consensus could not be reached, a third independent reviewer was consulted (JRM). The full list of quality appraisal questions can be found in Additional file 2 .

Data extraction

ORP extracted the study characteristics and a subset of these were checked by CH. Discrepancies were discussed and amendments made. Extracted data included author, title, location, year of publication, year study was carried out, research question/aim, creative methods used, number of participants, mean age, gender, ethnicity of participants, setting, limitations and strengths of creative PPI and main findings.

Data analysis

The included studies were analysed using inductive thematic analysis [ 11 ], where themes were determined by the data. The familiarisation stage took place during full-text reading of the included articles. Anything identified as a strength or limitation to creative PPI methods was extracted verbatim as an initial code and inputted into the data extraction Excel sheet. Similar codes were sorted into broader themes, either under ‘strengths’ or ‘limitations’ and reviewed. Themes were then assigned a name according to the codes.

The search yielded 9978 titles across the 5 databases: Web of Science (1480 results), PubMed (94 results), ASSIA (2454 results), CINAHL (5948 results) and Cochrane Library (2 results), resulting in 8553 different studies after deduplication. ORP and CH independently screened their titles and abstracts, excluding those that did not meet the criteria. After assessment, 12 studies were included (see Fig.  1 ).

figure 1

PRISMA flowchart of the study selection process

Study characteristics

The included studies were published between 2018 and 2022. Seven were conducted in the UK [ 12 , 14 , 15 , 17 , 18 , 19 , 23 ], two in Canada [ 21 , 22 ], one in Australia [ 13 ], one in Norway [ 16 ] and one in Ireland [ 20 ]. The PPI activities occurred across various settings, including a school [ 12 ], social club [ 12 ], hospital [ 17 ], university [ 22 ], theatre [ 19 ], hotel [ 20 ], or online [ 15 , 21 ], however this information was omitted in 5 studies [ 13 , 14 , 16 , 18 , 23 ]. The number of people attending the PPI sessions varied, ranging from 6 to 289, however the majority (ten studies) had less than 70 participants [ 13 , 14 , 16 , 17 , 18 , 19 , 20 , 21 , 22 , 23 ]. Seven studies did not provide information on the age or gender of the PPI groups. Of those that did, ages ranged from 8 to 76 and were mostly female. The ethnicities of the PPI group members were also rarely recorded (see Additional file 3 for data extraction table).

Types of creative methods

The type of creative methods used to engage the PPI groups were varied. These included songs, poems, drawings, photograph elicitation, drama performance, visualisations, Facebook, photography, prototype development, cultural animation, card sorting and creating personas (see Table  1 ). These were sometimes accompanied by traditional methods of PPI such as interviews and focus group discussions.

The 12 included studies were all deemed to be of good methodological quality, with scores ranging from 6/10 to 10/10 with the CASP critical appraisal tool [ 10 ] (Table  2 ).

Thematic analysis

Analysis identified four limitations and five strengths to creative PPI (see Fig.  2 ). Limitations included the time and resource intensity of creative PPI methods, its lack of generalisation, ethical issues and external factors. Strengths included the disruption of power hierarchies, the engaging and inclusive nature of the methods and their long-term cost and time efficiency. Creative PPI methods also allowed mundane and “taboo” topics to be discussed within a safe space.

figure 2

Theme map of strengths and limitations

Limitations of creative PPI

Creative ppi methods are time and resource intensive.

The time and resource intensive nature of creative PPI methods is a limitation, most notably for the persona-scenario methodology. Valaitis et al. [ 22 ] used 14 persona-scenario workshops with 70 participants to co-design a healthcare intervention, which aimed to promote optimal aging in Canada. Using the persona method, pairs composed of patients, healthcare providers, community service providers and volunteers developed a fictional character which they believed represented an ‘end-user’ of the healthcare intervention. Due to the depth and richness of the data produced the authors reported that it was time consuming to analyse. Further, they commented that the amount of information was difficult to disseminate to scientific leads and present at team meetings. Additionally, to ensure the production of high-quality data, to probe for details and lead group discussion there was a need for highly skilled facilitators. The resource intensive nature of the creative co-production was also noted in a study using the persona scenario and creative worksheets to develop a prototype decision support tool for individuals with malignant pleural effusion [ 17 ]. With approximately 50 people, this was also likely to yield a high volume of data to consider.

To prepare materials for populations who cannot engage in traditional methods of PPI was also timely. Kearns et al. [ 18 ] developed a feedback questionnaire for people with aphasia to evaluate ICT-delivered rehabilitation. To ensure people could participate effectively, the resources used during the workshops, such as PowerPoints, online images and photographs, had to be aphasia-accessible, which was labour and time intensive. The author warned that this time commitment should not be underestimated.

There are further practical limitations to implementing creative PPI, such as the costs of materials for activities as well as hiring a space for workshops. For example, the included studies in this review utilised pens, paper, worksheets, laptops, arts and craft supplies and magazines and took place in venues such as universities, a social club, and a hotel. Further, although not limited to creative PPI methods exclusively but rather most studies involving the public, a financial incentive was often offered for participation, as well as food, parking, transport and accommodation [ 21 , 22 ].

Creative PPI lacks generalisation

Another barrier to the use of creative PPI methods in health and social care research was the individual nature of its output. Those who participate, usually small in number, produce unique creative outputs specific to their own experiences, opinions and location. Craven et al. [ 13 ], used arts-based visualisations to develop a toolbox for adults with mental health difficulties. They commented, “such an approach might still not be worthwhile”, as the visualisations were individualised and highly personal. This indicates that the output may fail to meet the needs of its end-users. Further, these creative PPI groups were based in certain geographical regions such as Stoke-on-Trent [ 19 ] Sheffield [ 23 ], South Wales [ 12 ] or Ireland [ 20 ], which limits the extent the findings can be applied to wider populations, even within the same area due to individual nuances. Further, the study by Galler et al. [ 16 ], is specific to the Norwegian context and even then, maybe only a sub-group of the Norwegian population as the sample used was of higher socioeconomic status.

However, Grindell et al. [ 17 ], who used persona scenarios, creative worksheets and prototype development, pointed out that the purpose of this type of research is to improve a certain place, rather than apply findings across other populations and locations. Individualised output may, therefore, only be a limitation to research wanting to conduct PPI on a large scale.

If, however, greater generalisation within PPI is deemed necessary, then social media may offer a resolution. Fedorowicz et al. [ 15 ], used Facebook to gain feedback from the public on the use of video-recording methodology for an upcoming project. This had the benefit of including a more diverse range of people (289 people joined the closed group), who were spread geographically around the UK, as well as seven people from overseas.

Creative PPI has ethical issues

As with other research, ethical issues must be taken into consideration. Due to the nature of creative approaches, as well as the personal effort put into them, people often want to be recognised for their work. However, this compromises principles so heavily instilled in research such as anonymity and confidentiality. With the aim of exploring issues related to health and well-being in a town in South Wales, Byrne et al. [ 12 ], asked year 4/5 and year 10 pupils to create poems, songs, drawings and photographs. Community members also created a performance, mainly of monologues, to explore how poverty and inequalities are dealt with. Byrne noted the risks of these arts-based approaches, that being the possibility of over-disclosure and consequent emotional distress, as well as people’s desire to be named for their work. On one hand, the anonymity reduces the sense of ownership of the output as it does not portray a particular individual’s lived experience anymore. On the other hand, however, it could promote a more honest account of lived experience. Supporting this, Webber et al. [ 23 ], who used the persona method to co-design a back pain educational resource prototype, claimed that the anonymity provided by this creative technique allowed individuals to externalise and anonymise their own personal experience, thus creating a more authentic and genuine resource for future users. This implies that anonymity can be both a limitation and strength here.

The use of creative PPI methods is impeded by external factors

Despite the above limitations influencing the implementation of creative PPI techniques, perhaps the most influential is that creative methodologies are simply not mainstream [ 19 ]. This could be linked to the issues above, like time and resource intensity, generalisation and ethical issues but it is also likely to involve more systemic factors within the research community. Micsinszki et al. [ 21 ], who co-designed a hub for the health and well-being of vulnerable populations, commented that there is insufficient infrastructure to conduct meaningful co-design as well as a dominant medical model. Through a more holistic lens, there are “sociopolitical environments that privilege individualism over collectivism, self-sufficiency over collaboration, and scientific expertise over other ways of knowing based on lived experience” [ 21 ]. This, it could be suggested, renders creative co-design methodologies, which are based on the foundations of collectivism, collaboration and imagination an invalid technique in the research field, which is heavily dominated by more scientific methods offering reproducibility, objectivity and reliability.

Although we acknowledge that creative PPI techniques are not always appropriate, it may be that their main limitation is the lack of awareness of these methods or lack of willingness to use them. Further, there is always the risk that PPI, despite being a mandatory part of research, is used in a tokenistic or tick-box fashion [ 20 ], without considering the contribution that meaningful PPI could make to enhancing the research. It may be that PPI, let alone creative PPI, is not at the forefront of researchers’ minds when planning research.

Strengths of creative PPI

Creative ppi disrupts power hierarchies.

One of the main strengths of creative PPI techniques, cited most frequently in the included literature, was that they disrupt traditional power hierarchies [ 12 , 13 , 17 , 19 , 23 ]. For example, the use of theatre performance blurred the lines between professional and lay roles between the community and policy makers [ 12 ]. Individuals created a monologue to portray how poverty and inequality impact daily life and presented this to representatives of the National Assembly of Wales, Welsh Government, the Local Authority, Arts Council and Westminster. Byrne et al. [ 12 ], states how this medium allowed the community to engage with the people who make decisions about their lives in an environment of respect and understanding, where the hierarchies are not as visible as in other settings, e.g., political surgeries. Creative PPI methods have also removed traditional power hierarchies between researchers and adolescents. Cook et al. [ 13 ], used arts-based approaches to explore adolescents’ ideas about the “perfect” condom. They utilised the “Life Happens” resource, where adolescents drew and then decorated a person with their thoughts about sexual relationships, not too dissimilar from the persona-scenario method. This was then combined with hypothetical scenarios about sexuality. A condom-mapping exercise was then implemented, where groups shared the characteristics that make a condom “perfect” on large pieces of paper. Cook et al. [ 13 ], noted that usually power imbalances make it difficult to elicit information from adolescents, however these power imbalances were reduced due to the use of creative co-design techniques.

The same reduction in power hierarchies was noted by Grindell et al. [ 17 ], who used the person-scenario method and creative worksheets with individuals with malignant pleural effusion. This was with the aim of developing a prototype of a decision support tool for patients to help with treatment options. Although this process involved a variety of stakeholders, such as patients, carers and healthcare professionals, creative co-design was cited as a mechanism that worked to reduce power imbalances – a limitation of more traditional methods of research. Creative co-design blurred boundaries between end-users and clinical staff and enabled the sharing of ideas from multiple, valuable perspectives, meaning the prototype was able to suit user needs whilst addressing clinical problems.

Similarly, a specific creative method named cultural animation was also cited to dissolve hierarchies and encourage equal contributions from participants. Within this arts-based approach, Keleman et al. [ 19 ], explored the concept of “good health” with individuals from Stoke-on Trent. Members of the group created art installations using ribbons, buttons, cardboard and straws to depict their idea of a “healthy community”, which was accompanied by a poem. They also created a 3D Facebook page and produced another poem or song addressing the government to communicate their version of a “picture of health”. Public participants said that they found the process empowering, honest, democratic, valuable and practical.

This dissolving of hierarchies and levelling of power is beneficial as it increases the sense of ownership experienced by the creators/producers of the output [ 12 , 17 , 23 ]. This is advantageous as it has been suggested to improve its quality [ 23 ].

Creative PPI allows the unsayable to be said

Creative PPI fosters a safe space for mundane or taboo topics to be shared, which may be difficult to communicate using traditional methods of PPI. For example, the hypothetical nature of condom mapping and persona-scenarios meant that adolescents could discuss a personal topic without fear of discrimination, judgement or personal disclosure [ 13 ]. The safe space allowed a greater volume of ideas to be generated amongst peers where they might not have otherwise. Similarly, Webber et al. [ 23 ], , who used the persona method to co-design the prototype back pain educational resource, also noted how this method creates anonymity whilst allowing people the opportunity to externalise personal experiences, thoughts and feelings. Other creative methods were also used, such as drawing, collaging, role play and creating mood boards. A cardboard cube (labelled a “magic box”) was used to symbolise a physical representation of their final prototype. These creative methods levelled the playing field and made personal experiences accessible in a safe, open environment that fostered trust, as well as understanding from the researchers.

It is not only sensitive subjects that were made easier to articulate through creative PPI. The communication of mundane everyday experiences were also facilitated, which were deemed typically ‘unsayable’. This was specifically given in the context of describing intangible aspects of everyday health and wellbeing [ 11 ]. Graphic designers can also be used to visually represent the outputs of creative PPI. These captured the movement and fluidity of people and well as the relationships between them - things that cannot be spoken but can be depicted [ 21 ].

Creative PPI methods are inclusive

Another strength of creative PPI was that it is inclusive and accessible [ 17 , 19 , 21 ]. The safe space it fosters, as well as the dismantling of hierarchies, welcomed people from a diverse range of backgrounds and provided equal opportunities [ 21 ], especially for those with communication and memory difficulties who might be otherwise excluded from PPI. Kelemen et al. [ 19 ], who used creative methods to explore health and well-being in Stoke-on-Trent, discussed how people from different backgrounds came together and connected, discussed and reached a consensus over a topic which evoked strong emotions, that they all have in common. Individuals said that the techniques used “sets people to open up as they are not overwhelmed by words”. Similarly, creative activities, such as the persona method, have been stated to allow people to express themselves in an inclusive environment using a common language. Kearns et al. [ 18 ], who used aphasia-accessible material to develop a questionnaire with aphasic individuals, described how they felt comfortable in contributing to workshops (although this material was time-consuming to make, see ‘Limitations of creative PPI’ ).

Despite the general inclusivity of creative PPI, it can also be exclusive, particularly if online mediums are used. Fedorowicz et al. [ 15 ], used Facebook to create a PPI group, and although this may rectify previous drawbacks about lack of generalisation of creative methods (as Facebook can reach a greater number of people, globally), it excluded those who are not digitally active or have limited internet access or knowledge of technology. Online methods have other issues too. Maintaining the online group was cited as challenging and the volume of responses required researchers to interact outside of their working hours. Despite this, online methods like Facebook are very accessible for people who are physically disabled.

Creative PPI methods are engaging

The process of creative PPI is typically more engaging and produces more colourful data than traditional methods [ 13 ]. Individuals are permitted and encouraged to explore a creative self [ 19 ], which can lead to the exploration of new ideas and an overall increased enjoyment of the process. This increased engagement is particularly beneficial for younger PPI groups. For example, to involve children in the development of health food products, Galler et al. [ 16 ] asked 9-12-year-olds to take photos of their food and present it to other children in a “show and tell” fashion. They then created a newspaper article describing a new healthy snack. In this creative focus group, children were given lab coats to further their identity as inventors. Galler et al. [ 16 ], notes that the methods were highly engaging and facilitated teamwork and group learning. This collaborative nature of problem-solving was also observed in adults who used personas and creative worksheets to develop the resource for lower back pain [ 23 ]. Dementia patients too have been reported to enjoy the creative and informal approach to idea generation [ 20 ].

The use of cultural animation allowed people to connect with each other in a way that traditional methods do not [ 19 , 21 ]. These connections were held in place by boundary objects, such as ribbons, buttons, fabric and picture frames, which symbolised a shared meaning between people and an exchange of knowledge and emotion. Asking groups to create an art installation using these objects further fostered teamwork and collaboration, both at an individual and collective level. The exploration of a creative self increased energy levels and encouraged productive discussions and problem-solving [ 19 ]. Objects also encouraged a solution-focused approach and permitted people to think beyond their usual everyday scope [ 17 ]. They also allowed facilitators to probe deeper about the greater meanings carried by the object, which acted as a metaphor [ 21 ].

From the researcher’s point of view, co-creative methods gave rise to ideas they might not have initially considered. Valaitis et al. [ 22 ], found that over 40% of the creative outputs were novel ideas brought to light by patients, healthcare providers/community care providers, community service providers and volunteers. One researcher commented, “It [the creative methods] took me on a journey, in a way that when we do other pieces of research it can feel disconnected” [ 23 ]. Another researcher also stated they could not return to the way they used to do research, as they have learnt so much about their own health and community and how they are perceived [ 19 ]. This demonstrates that creative processes not only benefit the project outcomes and the PPI group, but also facilitators and researchers. However, although engaging, creative methods have been criticised for not demonstrating academic rigour [ 17 ]. Moreover, creative PPI may also be exclusive to people who do not like or enjoy creative activities.

Creative PPI methods are cost and time efficient

Creative PPI workshops can often produce output that is visible and tangible. This can save time and money in the long run as the output is either ready to be implemented in a healthcare setting or a first iteration has already been developed. This may also offset the time and costs it takes to implement creative PPI. For example, the prototype of the decision support tool for people with malignant pleural effusion was developed using personas and creative worksheets. The end result was two tangible prototypes to drive the initial idea forward as something to be used in practice [ 17 ]. The use of creative co-design in this case saved clinician time as well as the time it would take to develop this product without the help of its end-users. In the development of this particular prototype, analysis was iterative and informed the next stage of development, which again saved time. The same applies for the feedback questionnaire for the assessment of ICT delivered aphasia rehabilitation. The co-created questionnaire, designed with people with aphasia, was ready to be used in practice [ 18 ]. This suggests that to overcome time and resource barriers to creative PPI, researchers should aim for it to be engaging whilst also producing output.

That useable products are generated during creative workshops signals to participating patients and public members that they have been listened to and their thoughts and opinions acted upon [ 23 ]. For example, the development of the back pain resource based on patient experiences implies that their suggestions were valid and valuable. Further, those who participated in the cultural animation workshop reported that the process visualises change, and that it already feels as though the process of change has started [ 19 ].

The most cost and time efficient method of creative PPI in this review is most likely the use of Facebook to gather feedback on project methodology [ 15 ]. Although there were drawbacks to this, researchers could involve more people from a range of geographical areas at little to no cost. Feedback was instantaneous and no training was required. From the perspective of the PPI group, they could interact however much or little they wish with no time commitment.

This systematic review identified four limitations and five strengths to the use of creative PPI in health and social care research. Creative PPI is time and resource intensive, can raise ethical issues and lacks generalisability. It is also not accepted by the mainstream. These factors may act as barriers to the implementation of creative PPI. However, creative PPI disrupts traditional power hierarchies and creates a safe space for taboo or mundane topics. It is also engaging, inclusive and can be time and cost efficient in the long term.

Something that became apparent during data analysis was that these are not blanket strengths and limitations of creative PPI as a whole. The umbrella term ‘creative PPI’ is broad and encapsulates a wide range of activities, ranging from music and poems to prototype development and persona-scenarios, to more simplistic things like the use of sticky notes and ordering cards. Many different activities can be deemed ‘creative’ and the strengths and limitations of one does not necessarily apply to another. For example, cultural animation takes greater effort to prepare than the use of sticky notes and sorting cards, and the use of Facebook is cheaper and wider reaching than persona development. Researchers should use their discretion and weigh up the benefits and drawbacks of each method to decide on a technique which suits the project. What might be a limitation to creative PPI in one project may not be in another. In some cases, creative PPI may not be suitable at all.

Furthermore, the choice of creative PPI method also depends on the needs and characteristics of the PPI group. Children, adults and people living with dementia or language difficulties all have different engagement needs and capabilities. This indicates that creative PPI is not one size fits all and that the most appropriate method will change depending on the composition of the group. The choice of method will also be determined by the constraints of the research project, namely time, money and the research aim. For example, if there are time constraints, then a method which yields a lot of data and requires a lot of preparation may not be appropriate. If generalisation is important, then an online method is more suitable. Together this indicates that the choice of creative PPI method is highly individualised and dependent on multiple factors.

Although the limitations discussed in this review apply to creative PPI, they are not exclusive to creative PPI. Ethical issues are a consideration within general PPI research, especially when working with more vulnerable populations, such as children or adults living with a disability. It can also be the case that traditional PPI methods lack generalisability, as people who volunteer to be part of such a group are more likely be older, middle class and retired [ 24 ]. Most research is vulnerable to this type of bias, however, it is worth noting that generalisation is not always a goal and research remains valid and meaningful in its absence. Although online methods may somewhat combat issues related to generalisability, these methods still exclude people who do not have access to the internet/technology or who choose not to use it, implying that online PPI methods may not be wholly representative of the general population. Saying this, however, the accessibility of creative PPI techniques differs from person to person, and for some, online mediums may be more accessible (for example for those with a physical disability), and for others, this might be face-to-face. To combat this, a range of methods should be implemented. Planning multiple focus group and interviews for traditional PPI is also time and resource intensive, however the extra resources required to make this creative may be even greater. Although, the rich data provided may be worth the preparation and analysis time, which is also likely to depend on the number of participants and workshop sessions required. PPI, not just creative PPI, often requires the provision of a financial incentive, refreshments, parking and accommodation, which increase costs. These, however, are imperative and non-negotiable, as they increase the accessibility of research, especially to minority and lower-income groups less likely to participate. Adequate funding is also important for co-design studies where repeated engagement is required. One barrier to implementation, which appears to be exclusive to creative methods, however, is that creative methods are not mainstream. This cannot be said for traditional PPI as this is often a mandatory part of research applications.

Regarding the strengths of creative PPI, it could be argued that most appear to be exclusive to creative methodologies. These are inclusive by nature as multiple approaches can be taken to evoke ideas from different populations - approaches that do not necessarily rely on verbal or written communication like interviews and focus groups do. Given the anonymity provided by some creative methods, such as personas, people may be more likely to discuss their personal experiences under the guise of a general end-user, which might be more difficult to maintain when an interviewer is asking an individual questions directly. Additionally, creative methods are by nature more engaging and interactive than traditional methods, although this is a blanket statement and there may be people who find the question-and-answer/group discussion format more engaging. Creative methods have also been cited to eliminate power imbalances which exist in traditional research [ 12 , 13 , 17 , 19 , 23 ]. These imbalances exist between researchers and policy makers and adolescents, adults and the community. Lastly, although this may occur to a greater extent in creative methods like prototype development, it could be suggested that PPI in general – regardless of whether it is creative - is more time and cost efficient in the long-term than not using any PPI to guide or refine the research process. It must be noted that these are observations based on the literature. To be certain these differences exist between creative and traditional methods of PPI, direct empirical evaluation of both should be conducted.

To the best of our knowledge, this is the first review to identify the strengths and limitations to creative PPI, however, similar literature has identified barriers and facilitators to PPI in general. In the context of clinical trials, recruitment difficulties were cited as a barrier, as well as finding public contributors who were free during work/school hours. Trial managers reported finding group dynamics difficult to manage and the academic environment also made some public contributors feel nervous and lacking confidence to speak. Facilitators, however, included the shared ownership of the research – something that has been identified in the current review too. In addition, planning and the provision of knowledge, information and communication were also identified as facilitators [ 25 ]. Other research on the barriers to meaningful PPI in trial oversight committees included trialist confusion or scepticism over the PPI role and the difficulties in finding PPI members who had a basic understanding of research [ 26 ]. However, it could be argued that this is not representative of the average patient or public member. The formality of oversight meetings and the technical language used also acted as a barrier, which may imply that the informal nature of creative methods and its lack of dependency on literacy skills could overcome this. Further, a review of 42 reviews on PPI in health and social care identified financial compensation, resources, training and general support as necessary to conduct PPI, much like in the current review where the resource intensiveness of creative PPI was identified as a limitation. However, others were identified too, such as recruitment and representativeness of public contributors [ 27 ]. Like in the current review, power imbalances were also noted, however this was included as both a barrier and facilitator. Collaboration seemed to diminish hierarchies but not always, as sometimes these imbalances remained between public contributors and healthcare staff, described as a ‘them and us’ culture [ 27 ]. Although these studies compliment the findings of the current review, a direct comparison cannot be made as they do not concern creative methods. However, it does suggest that some strengths and weaknesses are shared between creative and traditional methods of PPI.

Strengths and limitations of this review

Although a general definition of creative PPI exists, it was up to our discretion to decide exactly which activities were deemed as such for this review. For example, we included sorting cards, the use of interactive whiteboards and sticky notes. Other researchers may have a more or less stringent criteria. However, two reviewers were involved in this decision which aids the reliability of the included articles. Further, it may be that some of the strengths and limitations cannot fully be attributed to the creative nature of the PPI process, but rather their co-created nature, however this is hard to disentangle as the included papers involved both these aspects.

During screening, it was difficult to decide whether the article was utilising creative qualitative methodology or creative PPI , as it was often not explicitly labelled as such. Regardless, both approaches involved the public/patients refining a healthcare product/service. This implies that if this review were to be replicated, others may do it differently. This may call for greater standardisation in the reporting of the public’s involvement in research. For example, the NIHR outlines different approaches to PPI, namely “consultation”, “collaboration”, “co-production” and “user-controlled”, which each signify an increased level of public power and influence [ 28 ]. Papers with elements of PPI could use these labels to clarify the extent of public involvement, or even explicitly state that there was no PPI. Further, given our decision to include only scholarly peer-reviewed literature, it is possible that data were missed within the grey literature. Similarly, the literature search will not have identified all papers relating to different types of accessible inclusion. However, the intent of the review was to focus solely on those within the definition of creative.

This review fills a gap in the literature and helps circulate and promote the concept of creative PPI. Each stage of this review, namely screening and quality appraisal, was conducted by two independent reviewers. However, four full texts could not be accessed during the full text reading stage, meaning there are missing data that could have altered or contributed to the findings of this review.

Research recommendations

Given that creative PPI can require effort to prepare, perform and analyse, sufficient time and funding should be allocated in the research protocol to enable meaningful and continuous PPI. This is worthwhile as PPI can significantly change the research output so that it aligns closely with the needs of the group it is to benefit. Researchers should also consider prototype development as a creative PPI activity as this might reduce future time/resource constraints. Shifting from a top-down approach within research to a bottom-up can be advantageous to all stakeholders and can help move creative PPI towards the mainstream. This, however, is the collective responsibility of funding bodies, universities and researchers, as well as committees who approve research bids.

A few of the included studies used creative techniques alongside traditional methods, such as interviews, which could also be used as a hybrid method of PPI, perhaps by researchers who are unfamiliar with creative techniques or to those who wish to reap the benefits of both. Often the characteristics of the PPI group were not included, including age, gender and ethnicity. It would be useful to include such information to assess how representative the PPI group is of the population of interest.

Creative PPI is a relatively novel approach of engaging the public and patients in research and it has both advantages and disadvantages compared to more traditional methods. There are many approaches to implementing creative PPI and the choice of technique will be unique to each piece of research and is reliant on several factors. These include the age and ability of the PPI group as well as the resource limitations of the project. Each method has benefits and drawbacks, which should be considered at the protocol-writing stage. However, given adequate funding, time and planning, creative PPI is a worthwhile and engaging method of generating ideas with end-users of research – ideas which may not be otherwise generated using traditional methods.

Data availability

No datasets were generated or analysed during the current study.

Abbreviations

Critical Appraisal Skills Programme

The Joanna Briggs Institute

National Institute of Health and Care Research

Public Advisory Group

Public and Patient Involvement

Web of Science

National Institute for Health and Care Research. What Is Patient and Public Involvement and Public Engagement? https://www.spcr.nihr.ac.uk/PPI/what-is-patient-and-public-involvement-and-engagement Accessed 01 Sept 2023.

Department of Health. Personal and Public Involvement (PPI) https://www.health-ni.gov.uk/topics/safety-and-quality-standards/personal-and-public-involvement-ppi#:~:text=The Health and Social Care Reform Act (NI) 2009 placed,delivery and evaluation of services . Accessed 01 Sept 2023.

National Institute for Health and Care Research. Policy Research Programme – Guidance for Stage 1 Applications https://www.nihr.ac.uk/documents/policy-research-programme-guidance-for-stage-1-applications-updated/26398 Accessed 01 Sept 2023.

Greenhalgh T, Hinton L, Finlay T, Macfarlane A, Fahy N, Clyde B, Chant A. Frameworks for supporting patient and public involvement in research: systematic review and co-design pilot. Health Expect. 2019. https://doi.org/10.1111/hex.12888

Article   PubMed   PubMed Central   Google Scholar  

Street JM, Stafinski T, Lopes E, Menon D. Defining the role of the public in health technology assessment (HTA) and HTA-informed decision-making processes. Int J Technol Assess Health Care. 2020. https://doi.org/10.1017/S0266462320000094

Article   PubMed   Google Scholar  

Morrison C, Dearden A. Beyond tokenistic participation: using representational artefacts to enable meaningful public participation in health service design. Health Policy. 2013. https://doi.org/10.1016/j.healthpol.2013.05.008

Leavy P. Method meets art: arts-Based Research Practice. New York: Guilford; 2020.

Google Scholar  

Seers K. Qualitative systematic reviews: their importance for our understanding of research relevant to pain. Br J Pain. 2015. https://doi.org/10.1177/2049463714549777

Lockwood C, Porritt K, Munn Z, Rittenmeyer L, Salmond S, Bjerrum M, Loveday H, Carrier J, Stannard D. Chapter 2: Systematic reviews of qualitative evidence. Aromataris E, Munn Z, editors. JBI Manual for Evidence Synthesis JBI. 2020. https://synthesismanual.jbi.global . https://doi.org/10.46658/JBIMES-20-03

CASP. CASP Checklists https://casp-uk.net/images/checklist/documents/CASP-Qualitative-Studies-Checklist/CASP-Qualitative-Checklist-2018_fillable_form.pdf (2022).

Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Res Psychol. 2006. https://doi.org/10.1191/1478088706qp063oa

Article   Google Scholar  

Byrne E, Elliott E, Saltus R, Angharad J. The creative turn in evidence for public health: community and arts-based methodologies. J Public Health. 2018. https://doi.org/10.1093/pubmed/fdx151

Cook S, Grozdanovski L, Renda G, Santoso D, Gorkin R, Senior K. Can you design the perfect condom? Engaging young people to inform safe sexual health practice and innovation. Sex Educ. 2022. https://doi.org/10.1080/14681811.2021.1891040

Craven MP, Goodwin R, Rawsthorne M, Butler D, Waddingham P, Brown S, Jamieson M. Try to see it my way: exploring the co-design of visual presentations of wellbeing through a workshop process. Perspect Public Health. 2019. https://doi.org/10.1177/1757913919835231

Fedorowicz S, Riley V, Cowap L, Ellis NJ, Chambers R, Grogan S, Crone D, Cottrell E, Clark-Carter D, Roberts L, Gidlow CJ. Using social media for patient and public involvement and engagement in health research: the process and impact of a closed Facebook group. Health Expect. 2022. https://doi.org/10.1111/hex.13515

Galler M, Myhrer K, Ares G, Varela P. Listening to children voices in early stages of new product development through co-creation – creative focus group and online platform. Food Res Int. 2022. https://doi.org/10.1016/j.foodres.2022.111000

Grindell C, Tod A, Bec R, Wolstenholme D, Bhatnagar R, Sivakumar P, Morley A, Holme J, Lyons J, Ahmed M, Jackson S, Wallace D, Noorzad F, Kamalanathan M, Ahmed L, Evison M. Using creative co-design to develop a decision support tool for people with malignant pleural effusion. BMC Med Inf Decis Mak. 2020. https://doi.org/10.1186/s12911-020-01200-3

Kearns Á, Kelly H, Pitt I. Rating experience of ICT-delivered aphasia rehabilitation: co-design of a feedback questionnaire. Aphasiology. 2020. https://doi.org/10.1080/02687038.2019.1649913

Kelemen M, Surman E, Dikomitis L. Cultural animation in health research: an innovative methodology for patient and public involvement and engagement. Health Expect. 2018. https://doi.org/10.1111/hex.12677

Keogh F, Carney P, O’Shea E. Innovative methods for involving people with dementia and carers in the policymaking process. Health Expect. 2021. https://doi.org/10.1111/hex.13213

Micsinszki SK, Buettgen A, Mulvale G, Moll S, Wyndham-West M, Bruce E, Rogerson K, Murray-Leung L, Fleisig R, Park S, Phoenix M. Creative processes in co-designing a co-design hub: towards system change in health and social services in collaboration with structurally vulnerable populations. Evid Policy. 2022. https://doi.org/10.1332/174426421X16366319768599

Valaitis R, Longaphy J, Ploeg J, Agarwal G, Oliver D, Nair K, Kastner M, Avilla E, Dolovich L. Health TAPESTRY: co-designing interprofessional primary care programs for older adults using the persona-scenario method. BMC Fam Pract. 2019. https://doi.org/10.1186/s12875-019-1013-9

Webber R, Partridge R, Grindell C. The creative co-design of low back pain education resources. Evid Policy. 2022. https://doi.org/10.1332/174426421X16437342906266

National Institute for Health and Care Research. A Researcher’s Guide to Patient and Public Involvement. https://oxfordbrc.nihr.ac.uk/wp-content/uploads/2017/03/A-Researchers-Guide-to-PPI.pdf Accessed 01 Nov 2023.

Selman L, Clement C, Douglas M, Douglas K, Taylor J, Metcalfe C, Lane J, Horwood J. Patient and public involvement in randomised clinical trials: a mixed-methods study of a clinical trials unit to identify good practice, barriers and facilitators. Trials. 2021 https://doi.org/10.1186/s13063-021-05701-y

Coulman K, Nicholson A, Shaw A, Daykin A, Selman L, Macefield R, Shorter G, Cramer H, Sydes M, Gamble C, Pick M, Taylor G, Lane J. Understanding and optimising patient and public involvement in trial oversight: an ethnographic study of eight clinical trials. Trials. 2020. https://doi.org/10.1186/s13063-020-04495-9

Ocloo J, Garfield S, Franklin B, Dawson S. Exploring the theory, barriers and enablers for patient and public involvement across health, social care and patient safety: a systematic review of reviews. Health Res Policy Sys. 2021. https://doi.org/10.1186/s12961-020-00644-3

National Institute for Health and Care Research. Briefing notes for researchers - public involvement in NHS, health and social care research. https://www.nihr.ac.uk/documents/briefing-notes-for-researchers-public-involvement-in-nhs-health-and-social-care-research/27371 Accessed 01 Nov 2023.

Download references

Acknowledgements

With thanks to the PHIRST-LIGHT public advisory group and consortium for their thoughts and contributions to the design of this work.

The research team is supported by a National Institute for Health and Care Research grant (PHIRST-LIGHT Reference NIHR 135190).

Author information

Olivia R. Phillips and Cerian Harries share joint first authorship.

Authors and Affiliations

Nottingham Centre for Public Health and Epidemiology, Lifespan and Population Health, School of Medicine, University of Nottingham, Clinical Sciences Building, City Hospital Campus, Hucknall Road, Nottingham, NG5 1PB, UK

Olivia R. Phillips, Jo Leonardi-Bee, Holly Knight & Joanne R. Morling

National Institute for Health and Care Research (NIHR) PHIRST-LIGHT, Nottingham, UK

Olivia R. Phillips, Cerian Harries, Jo Leonardi-Bee, Holly Knight, Lauren B. Sherar, Veronica Varela-Mato & Joanne R. Morling

School of Sport, Exercise and Health Sciences, Loughborough University, Epinal Way, Loughborough, Leicestershire, LE11 3TU, UK

Cerian Harries, Lauren B. Sherar & Veronica Varela-Mato

Nottingham Centre for Evidence Based Healthcare, School of Medicine, University of Nottingham, Nottingham, UK

Jo Leonardi-Bee

NIHR Nottingham Biomedical Research Centre (BRC), Nottingham University Hospitals NHS Trust, University of Nottingham, Nottingham, NG7 2UH, UK

Joanne R. Morling

You can also search for this author in PubMed   Google Scholar

Contributions

Author contributions: study design: ORP, CH, JRM, JLB, HK, LBS, VVM, literature searching and screening: ORP, CH, JRM, data curation: ORP, CH, analysis: ORP, CH, JRM, manuscript draft: ORP, CH, JRM, Plain English Summary: ORP, manuscript critical review and editing: ORP, CH, JRM, JLB, HK, LBS, VVM.

Corresponding author

Correspondence to Olivia R. Phillips .

Ethics declarations

Ethics approval and consent to participate.

The Ethics Committee of the Faculty of Medicine and Health Sciences, University of Nottingham advised that approval from the ethics committee and consent to participate was not required for systematic review studies.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

40900_2024_580_MOESM1_ESM.docx

Additional file 1: Search strings: Description of data: the search strings and filters used in each of the 5 databases in this review

Additional file 2: Quality appraisal questions: Description of data: CASP quality appraisal questions

40900_2024_580_moesm3_esm.docx.

Additional file 3: Table 1: Description of data: elements of the data extraction table that are not in the main manuscript

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Phillips, O.R., Harries, C., Leonardi-Bee, J. et al. What are the strengths and limitations to utilising creative methods in public and patient involvement in health and social care research? A qualitative systematic review. Res Involv Engagem 10 , 48 (2024). https://doi.org/10.1186/s40900-024-00580-4

Download citation

Received : 28 November 2023

Accepted : 25 April 2024

Published : 13 May 2024

DOI : https://doi.org/10.1186/s40900-024-00580-4

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Public and patient involvement
  • Creative PPI
  • Qualitative systematic review

Research Involvement and Engagement

ISSN: 2056-7529

what is data analysis procedures for qualitative research

  • Open access
  • Published: 13 May 2024

Patient medication management, understanding and adherence during the transition from hospital to outpatient care - a qualitative longitudinal study in polymorbid patients with type 2 diabetes

  • Léa Solh Dost   ORCID: orcid.org/0000-0001-5767-1305 1 , 2 ,
  • Giacomo Gastaldi   ORCID: orcid.org/0000-0001-6327-7451 3 &
  • Marie P. Schneider   ORCID: orcid.org/0000-0002-7557-9278 1 , 2  

BMC Health Services Research volume  24 , Article number:  620 ( 2024 ) Cite this article

131 Accesses

Metrics details

Continuity of care is under great pressure during the transition from hospital to outpatient care. Medication changes during hospitalization may be poorly communicated and understood, compromising patient safety during the transition from hospital to home. The main aims of this study were to investigate the perspectives of patients with type 2 diabetes and multimorbidities on their medications from hospital discharge to outpatient care, and their healthcare journey through the outpatient healthcare system. In this article, we present the results focusing on patients’ perspectives of their medications from hospital to two months after discharge.

Patients with type 2 diabetes, with at least two comorbidities and who returned home after discharge, were recruited during their hospitalization. A descriptive qualitative longitudinal research approach was adopted, with four in-depth semi-structured interviews per participant over a period of two months after discharge. Interviews were based on semi-structured guides, transcribed verbatim, and a thematic analysis was conducted.

Twenty-one participants were included from October 2020 to July 2021. Seventy-five interviews were conducted. Three main themes were identified: (A) Medication management, (B) Medication understanding, and (C) Medication adherence, during three periods: (1) Hospitalization, (2) Care transition, and (3) Outpatient care. Participants had varying levels of need for medication information and involvement in medication management during hospitalization and in outpatient care. The transition from hospital to autonomous medication management was difficult for most participants, who quickly returned to their routines with some participants experiencing difficulties in medication adherence.

Conclusions

The transition from hospital to outpatient care is a challenging process during which discharged patients are vulnerable and are willing to take steps to better manage, understand, and adhere to their medications. The resulting tension between patients’ difficulties with their medications and lack of standardized healthcare support calls for interprofessional guidelines to better address patients’ needs, increase their safety, and standardize physicians’, pharmacists’, and nurses’ roles and responsibilities.

Peer Review reports

Introduction

Continuity of patient care is characterized as the collaborative engagement between the patient and their physician-led care team in the ongoing management of healthcare, with the mutual objective of delivering high-quality and cost-effective medical care [ 1 ]. Continuity of care is under great pressure during the transition of care from hospital to outpatient care, with a risk of compromising patients’ safety [ 2 , 3 ]. The early post-discharge period is a high-risk and fragile transition: once discharged, one in five patients experience at least one adverse event during the first three weeks following discharge, and more than half of these adverse events are drug-related [ 4 , 5 ]. A retrospective study examining all discharged patients showed that adverse drug events (ADEs) account for up to 20% of 30-day hospital emergency readmissions [ 6 ]. During hospitalization, patients’ medications are generally modified, with an average of nearly four medication changes per patient [ 7 ]. Information regarding medications such as medication changes, the expected effect, side effects, and instructions for use are frequently poorly communicated to patients during hospitalization and at discharge [ 8 , 9 , 10 , 11 ]. Between 20 and 60% of discharged patients lack knowledge of their medications [ 12 , 13 ]. Consideration of patients’ needs and their active engagement in decision-making during hospitalization regarding their medications are often lacking [ 11 , 14 , 15 ]. This can lead to unsafe discharge and contribute to medication adherence difficulties, such as non-implementation of newly prescribed medications [ 16 , 17 ].

Patients with multiple comorbidities and polypharmacy are at higher risk of ADE [ 18 ]. Type 2 diabetes is one of the chronic health conditions most frequently associated with comorbidities and patients with type 2 diabetes often lack care continuum [ 19 , 20 , 21 ]. The prevalence of patients hospitalized with type 2 diabetes can exceed 40% [ 22 ] and these patients are at higher risk for readmission due to their comorbidities and their medications, such as insulin and oral hypoglycemic agents [ 23 , 24 , 25 ].

Interventions and strategies to improve patient care and safety at transition have shown mixed results worldwide in reducing cost, rehospitalization, ADE, and non-adherence [ 26 , 27 , 28 , 29 , 30 , 31 , 32 , 33 , 34 , 35 ]. However, interventions that are patient-centered, with a patient follow-up and led by interprofessional healthcare teams showed promising results [ 34 , 35 , 36 ]. Most of these interventions have not been implemented routinely due to the extensive time to translate research into practice and the lack of hybrid implementation studies [ 37 , 38 , 39 , 40 , 41 ]. In addition, patient-reported outcomes and perspectives have rarely been considered, yet patients’ involvement is essential for seamless and integrated care [ 42 , 43 ]. Interprofessional collaboration in which patients are full members of the interprofessional team, is still in its infancy in outpatient care [ 44 ]. Barriers and facilitators regarding medications at the transition of care have been explored in multiple qualitative studies at one given time in a given setting (e.g., at discharge, one-month post-discharge) [ 8 , 45 , 46 , 47 , 48 ]. However, few studies have adopted a holistic methodology from the hospital to the outpatient setting to explore changes in patients’ perspectives over time [ 49 , 50 , 51 ]. Finally, little is known about whether, how, and when patients return to their daily routine following hospitalization and the impact of hospitalization weeks after discharge.

In Switzerland, continuity of care after hospital discharge is still poorly documented, both in terms of contextual analysis and interventional studies, and is mainly conducted in the hospital setting [ 31 , 35 , 52 , 53 , 54 , 55 , 56 ]. The first step of an implementation science approach is to perform a contextual analysis to set up effective interventions adapted to patients’ needs and aligned to healthcare professionals’ activities in a specific context [ 41 , 57 ]. Therefore, the main aims of this study were to investigate the perspectives of patients with type 2 diabetes and multimorbidities on their medications from hospital discharge to outpatient care, and on their healthcare journey through the outpatient healthcare system. In this article, we present the results focusing on patients’ perspectives of their medications from hospital to two months after discharge.

Study design

This qualitative longitudinal study, conducted from October 2020 to July 2021, used a qualitative descriptive methodology through four consecutive in-depth semi-structured interviews per participant at three, 10-, 30- and 60-days post-discharge, as illustrated in Fig.  1 . Longitudinal qualitative research is characterized by qualitative data collection at different points in time and focuses on temporality, such as time and change [ 58 , 59 ]. Qualitative descriptive studies aim to explore and describe the depth and complexity of human experiences or phenomena [ 60 , 61 , 62 ]. We focused our qualitative study on the 60 first days after discharge as this period is considered highly vulnerable and because studies often use 30- or 60-days readmission as an outcome measure [ 5 , 63 ].

This qualitative study follows the Consolidated Criteria for Reporting Qualitative Research (COREQ). Ethics committee approval was sought and granted by the Cantonal Research Ethics Commission, Geneva (CCER) (2020 − 01779).

Recruitment took place during participants’ hospitalization in the general internal medicine divisions at the Geneva University Hospitals in the canton of Geneva (500 000 inhabitants), Switzerland. Interviews took place at participants’ homes, in a private office at the University of Geneva, by telephone or by secure video call, according to participants’ preference. Informal caregivers could also participate alongside the participants.

figure 1

Study flowchart

Researcher characteristics

All the researchers were trained in qualitative studies. The diabetologist and researcher (GG) who enrolled the patients in the study was involved directly or indirectly (advice asked to the Geneva University Hospital diabetes team of which he was a part) for most participants’ care during hospitalization. LS (Ph.D. student and community pharmacist) was unknown to participants and presented herself during hospitalization as a “researcher” and not as a healthcare professional to avoid any risk of influencing participants’ answers. This study was not interventional, and the interviewer (LS) invited participants to contact a healthcare professional for any questions related to their medication or medical issues.

Population and sampling strategy

Patients with type 2 diabetes were chosen as an example population to describe polypharmacy patients as these patients usually have several health issues and polypharmacy [ 20 , 22 , 25 ]. Inclusions criteria for the study were: adult patients with type 2 diabetes, with at least two other comorbidities, hospitalized for at least three days in a general internal medicine ward, with a minimum of one medication change during hospital stay, and who self-managed their medications once discharged home. Exclusion criteria were patients not reachable by telephone following discharge, unable to give consent (patients with schizophrenia, dementia, brain damage, or drug/alcohol misuse), and who could not communicate in French. A purposive sampling methodology was applied aiming to include participants with different ages, genders, types, and numbers of health conditions by listing participants’ characteristics in a double-entry table, available in Supplementary Material 1 , until thematic saturation was reached. Thematic saturation was considered achieved when no new code or theme emerged and new data repeated previously coded information [ 64 ]. The participants were identified if they were hospitalized in the ward dedicated to diabetes care or when the diabetes team was contacted for advice. The senior ward physician (GG) screened eligible patients and the interviewer (LS) obtained written consent before hospital discharge.

Data collection and instruments

Sociodemographic (age, gender, educational level, living arrangement) and clinical characteristics (reason for hospitalization, date of admission, health conditions, diabetes diagnosis, medications before and during hospitalization) were collected by interviewing participants before their discharge and by extracting participants’ data from electronic hospital files by GG and LS. Participants’ pharmacies were contacted with the participant’s consent to obtain medication records from the last three months if information regarding medications before hospitalization was missing in the hospital files.

Semi-structured interview guides for each interview (at three, 10-, 30- and 60-days post-discharge) were developed based on different theories and components of health behavior and medication adherence: the World Health Organization’s (WHO) five dimensions for adherence, the Information-Motivation-Behavioral skills model and the Social Cognitive Theory [ 65 , 66 , 67 ]. Each interview explored participants’ itinerary in the healthcare system and their perspectives on their medications. Regarding medications, the following themes were mentioned at each interview: changes in medications, patients’ understanding and implication; information on their medications, self-management of their medications, and patients’ medication adherence. Other aspects were mentioned in specific interviews: patients’ hospitalization and experience on their return home (interview 1), motivation (interviews 2 and 4), and patient’s feedback on the past two months (interview 4). Interview guides translated from French are available in Supplementary Material 2 . The participants completed self-reported and self-administrated questionnaires at different interviews to obtain descriptive information on different factors that may affect medication management and adherence: self-report questionnaires on quality of life (EQ-5D-5 L) [ 68 ], literacy (Schooling-Opinion-Support questionnaire) [ 69 ], medication adherence (Adherence Visual Analogue Scale, A-VAS) [ 70 ] and Belief in Medication Questionnaire (BMQ) [ 71 ] were administered to each participant at the end of selected interviews to address the different factors that may affect medication management and adherence as well as to determine a trend of determinants over time. The BMQ contains two subscores: Specific-Necessity and Specific-Concerns, addressing respectively their perceived needs for their medications, and their concerns about adverse consequences associated with taking their medication [ 72 ].

Data management

Informed consent forms, including consent to obtain health data, were securely stored in a private office at the University of Geneva. The participants’ identification key was protected by a password known only by MS and LS. Confidentiality was guaranteed by pseudonymization of participants’ information and audio-recordings were destroyed once analyzed. Sociodemographic and clinical characteristics, medication changes, and answers to questionnaires were securely collected by electronic case report forms (eCRFs) on RedCap®. Interviews were double audio-recorded and field notes were taken during interviews. Recorded interviews were manually transcribed verbatim in MAXQDA® (2018.2) by research assistants and LS and transcripts were validated for accuracy by LS. A random sample of 20% of questionnaires was checked for accuracy for the transcription from the paper questionnaires to the eCRFs. Recorded sequences with no link to the discussed topics were not transcribed and this was noted in the transcripts.

Data analysis

A descriptive statistical analysis of sociodemographic, clinical characteristics and self-reported questionnaire data was carried out. A thematic analysis of transcripts was performed, as described by Braun and Clarke [ 73 ], by following six steps: raw data was read, text segments related to the study objectives were identified, text segments to create new categories were identified, similar or redundant categories were reduced and a model that integrated all significant categories was created. The analysis was conducted in parallel with patient enrolment to ensure data saturation. To ensure the validity of the coding method, transcripts were double coded independently and discussed by the research team until similar themes were obtained. The research group developed and validated an analysis grid, with which LS coded systematically the transcriptions and met regularly with the research team to discuss questions on data analysis and to ensure the quality of coding. The analysis was carried out in French, and the verbatims of interest cited in the manuscript were translated and validated by a native English-speaking researcher to preserve the meaning.

In this analysis, we used the term “healthcare professionals” when more than one profession could be involved in participants’ medication management. Otherwise, when a specific healthcare professional was involved, we used the designated profession (e.g. physicians, pharmacists).

Patient and public involvement

During the development phase of the study, interview guides and questionnaires were reviewed for clarity and validity and adapted by two patient partners, with multiple health conditions and who experienced previously a hospital discharge. They are part of the HUG Patients Partners + 3P platform for research and patient and public involvement.

Interviews and participants’ descriptions

A total of 75 interviews were conducted with 21 participants. In total, 31 patients were contacted, seven refused to participate (four at the project presentation and three at consent), two did not enter the selection criteria at discharge and one was unreachable after discharge. Among the 21 participants, 15 participated in all interviews, four in three interviews, one in two interviews, and one in one interview, due to scheduling constraints. Details regarding interviews and participants characteristics are presented in Tables  1 and 2 .

The median length of time between hospital discharge and interviews 1,2,3 and 4 was 5 (IQR: 4–7), 14 (13-20), 35 (22-38), and 63 days (61-68), respectively. On average, by comparing medications at hospital admission and discharge, a median of 7 medication changes (IQR: 6–9, range:2;17) occurred per participant during hospitalization and a median of 7 changes (5–12) during the two months following discharge. Details regarding participants’ medications are described in Table  3 .

Patient self-reported adherence over the past week for their three most challenging medications are available in Supplementary Material 3 .

Qualitative analysis

We defined care transition as the period from discharge until the first medical appointment post-discharge, and outpatient care as the period starting after the first medical appointment. Data was organized into three key themes (A. Medication management, B. Medication understanding, and C. Medication adherence) divided into subthemes at three time points (1. Hospitalization, 2. Care transition and 3. Outpatient care). Figure  2 summarizes and illustrates the themes and subthemes with their influencing factors as bullet points.

figure 2

Participants’ medication management, understanding and adherence during hospitalization, care transition and outpatient care

A. Medication management

A.1 medication management during hospitalization: medication management by hospital staff.

Medications during hospitalization were mainly managed by hospital healthcare professionals (i.e. nurses and physicians) with varying degrees of patient involvement: “At the hospital, they prepared the medications for me. […] I didn’t even know what the packages looked like.” Participant 22; interview 1 (P22.1) Some participants reported having therapeutic education sessions with specialized nurses and physicians, such as the explanation and demonstration of insulin injection and glucose monitoring. A patient reported that he was given the choice of several treatments and was involved in shared decision-making. Other participants had an active role in managing and optimizing dosages, such as rapid insulin, due to prior knowledge and use of medications before hospitalization.

A.2 Medication management at transition: obtaining the medication and initiating self-management

Once discharged, some participants had difficulties obtaining their medications at the pharmacy because some medications were not stored and had to be ordered, delaying medication initiation. To counter this problem upstream, a few participants were provided a 24-to-48-hour supply of medications at discharge. It was sometimes requested by the patient or suggested by the healthcare professionals but was not systematic. The transition from medication management by hospital staff to self-management was exhausting for most participants who were faced with a large amount of new information and changes in their medications: “ When I was in the hospital, I didn’t even realize all the changes. When I came back home, I took away the old medication packages and got out the new ones. And then I thought : « my God, all this…I didn’t know I had all these changes » ” P2.1 Written documentation, such as the discharge prescription or dosage labels on medication packages, was helpful in managing their medication at home. Most participants used weekly pill organizers to manage their medications, which were either already used before hospitalization or were introduced post-discharge. The help of a family caregiver in managing and obtaining medications was reported as a facilitator.

A.3 Medication management in outpatient care: daily self-management and medication burden

A couple of days or weeks after discharge, most participants had acquired a routine so that medication management was less demanding, but the medication burden varied depending on the participants. For some, medication management became a simple action well implemented in their routine (“It has become automatic” , P23.4), while for others, the number of medications and the fact that the medications reminded them of the disease was a heavy burden to bear on a daily basis (“ During the first few days after getting out of the hospital, I thought I was going to do everything right. In the end, well [laughs] it’s complicated. I ended up not always taking the medication, not monitoring the blood sugar” P12.2) To support medication self-management, some participants had written documentation such as treatment plans, medication lists, and pictures of their medication packages on their phones. Some participants had difficulties obtaining medications weeks after discharge as discharge prescriptions were not renewable and participants did not see their physician in time. Others had to visit multiple physicians to have their prescriptions updated. A few participants were faced with prescription or dispensing errors, such as prescribing or dispensing the wrong dosage, which affected medication management and decreased trust in healthcare professionals. In most cases, according to participants, the pharmacy staff worked in an interprofessional collaboration with physicians to provide new and updated prescriptions.

B. Medication understanding

B.1 medication understanding during hospitalization: new information and instructions.

The amount of information received during hospitalization varied considerably among participants with some reporting having received too much, while others saying they received too little information regarding medication changes, the reason for changes, or for introducing new medications: “They told me I had to take this medication all my life, but they didn’t tell me what the effects were or why I was taking it.” P5.3

Hospitalization was seen by some participants as a vulnerable and tiring period during which they were less receptive to information. Information and explanations were generally given verbally, making it complicated for most participants to recall it. Some participants reported that hospital staff was attentive to their needs for information and used communication techniques such as teach-back (a way of checking understanding by asking participants to say in their own words what they need to know or do about their health or medications). Some participants were willing to be proactive in the understanding of their medications while others were more passive, had no specific needs for information, and did not see how they could be engaged more.

B.2 Medication understanding at transition: facing medication changes

At hospital discharge, the most challenging difficulty for participants was to understand the changes made regarding their medications. For newly diagnosed participants, the addition of new medications was more difficult to understand, whereas, for experienced participants, changes in known medications such as dosage modification, changes within a therapeutic class, and generic substitutions were the most difficult to understand. Not having been informed about changes caused confusion and misunderstanding. Therefore, medication reconciliation done by the patient was time-consuming, especially for participants with multiple medications: “ They didn’t tell me at all that they had changed my treatment completely. They just told me : « We’ve changed a few things. But it was the whole treatment ». ” P2.3 Written information, such as the discharge prescription, the discharge report (brief letter summarizing information about the hospitalization, given to the patient at discharge), or the label on the medication box (written by the pharmacist with instructions on dosage) helped them find or recall information about their medications and diagnoses. However, technical terms were used in hospital documentations and were not always understandable. For example, this participant said: “ On the prescription of valsartan, they wrote: ‘resume in the morning once profile…’[once hypertension profile allows]… I don’t know what that means.” P8.1 In addition, some documents were incomplete, as mentioned by a patient who did not have the insulin dosage mentioned on the hospital prescription. Some participants sought help from healthcare professionals, such as pharmacists, hospital physicians, or general practitioners a few days after discharge to review medications, answer questions, or obtain additional information.

B.3 Medication understanding in the outpatient care: concerns and knowledge

Weeks after discharge, most participants had concerns about the long-term use of their medications, their usefulness, and the possible risk of interactions or side effects. Some participants also reported having some lack of knowledge regarding indications, names, or how the medication worked: “I don’t even know what Brilique® [ticagrelor, antiplatelet agent] is for. It’s for blood pressure, isn’t it?. I don’t know.” P11.4 According to participants, the main reasons for the lack of understanding were the lack of information at the time of prescribing and the large number of medications, making it difficult to search for information and remember it. Participants sought information from different healthcare professionals or by themselves, on package inserts, through the internet, or from family and friends. Others reported having had all the information needed or were not interested in having more information. In addition, participants with low medication literacy, such as non-native speakers or elderly people, struggled more with medication understanding and sought help from family caregivers or healthcare professionals, even weeks after discharge: “ I don’t understand French very well […] [The doctor] explained it very quickly…[…] I didn’t understand everything he was saying” P16.2

C. Medication adherence

C.2 medication adherence at transition: adopting new behaviors.

Medication adherence was not mentioned as a concern during hospitalization and a few participants reported difficulties in medication initiation once back home: “I have an injection of Lantus® [insulin] in the morning, but obviously, the first day [after discharge], I forgot to do it because I was not used to it.” P23.1 Participants had to quickly adopt new behaviors in the first few days after discharge, especially for participants with few medications pre-hospitalization. The use of weekly pill organizers, alarms and specific storage space were reported as facilitators to support adherence. One patient did not initiate one of his medications because he did not understand the medication indication, and another patient took her old medications because she was used to them. Moreover, most participants experienced their hospitalization as a turning point, a time when they focused on their health, thought about the importance of their medications, and discussed any new lifestyle or dietary measures that might be implemented.

C.3 Medication adherence in outpatient care: ongoing medication adherence

More medication adherence difficulties appeared a few weeks after hospital discharge when most participants reported nonadherence behaviors, such as difficulties implementing the dosage regimen, or intentionally discontinuing the medication and modifying the medication regimen on their initiative. Determinants positively influencing medication adherence were the establishment of a routine; organizing medications in weekly pill-organizers; organizing pocket doses (medications for a short period that participants take with them when away from home); seeking support from family caregivers; using alarm clocks; and using specific storage places. Reasons for nonadherence were changes in daily routine; intake times that were not convenient for the patient; the large number of medications; and poor knowledge of the medication or side effects. Healthcare professionals’ assistance for medication management, such as the help of home nurses or pharmacists for the preparation of weekly pill-organizers, was requested by participants or offered by healthcare professionals to support medication adherence: “ I needed [a home nurse] to put my pills in the pillbox. […] I felt really weak […] and I was making mistakes. So, I’m very happy [the doctor] offered me [home care]. […] I have so many medications.” P22.3 Some participants who experienced prehospitalization non-adherence were more aware of their non-adherence and implemented strategies, such as modifying the timing of intake: “I said to my doctor : « I forget one time out of two […], can I take them in the morning? » We looked it up and yes, I can take it in the morning.” P11.2 In contrast, some participants were still struggling with adherence difficulties that they had before hospitalization. Motivations for taking medications two months after discharge were to improve health, avoid complications, reduce symptoms, reduce the number of medications in the future or out of obligation: “ I force myself to take them because I want to get to the end of my diabetes, I want to reduce the number of pills as much as possible.” P14.2 After a few weeks post-hospitalization, for some participants, health and illness were no longer the priority because of other life imperatives (e.g., family or financial situation).

This longitudinal study provided a multi-faceted representation of how patients manage, understand, and adhere to their medications from hospital discharge to two months after discharge. Our findings highlighted the varying degree of participants’ involvement in managing their medications during their hospitalization, the individualized needs for information during and after hospitalization, the complicated transition from hospital to autonomous medication management, the adaptation of daily routines around medication once back home, and the adherence difficulties that surfaced in the outpatient care, with nonadherence prior to hospitalization being an indicator of the behavior after discharge. Finally, our results confirmed the lack of continuity in care and showed the lack of patient care standardization experienced by the participants during the transition from hospital to outpatient care.

This in-depth analysis of patients’ experiences reinforces common challenges identified in the existing literature such as the lack of personalized information [ 9 , 10 , 11 ], loss of autonomy during hospitalization [ 14 , 74 , 75 ], difficulties in obtaining medication at discharge [ 11 , 45 , 76 ] and challenges in understanding treatment modifications and generics substitution [ 11 , 32 , 77 , 78 ]. Some of these studies were conducted during patients’ hospitalization [ 10 , 75 , 79 ] or up to 12 months after discharge [ 80 , 81 ], but most studies focused on the few days following hospital discharge [ 9 , 11 , 14 , 82 ]. Qualitative studies on medications at transition often focused on a specific topic, such as medication information, or a specific moment in time, and often included healthcare professionals, which muted patients’ voices [ 9 , 10 , 11 , 47 , 49 ]. Our qualitative longitudinal methodology was interested in capturing the temporal dynamics, in-depth narratives, and contextual nuances of patients’ medication experiences during transitions of care [ 59 , 83 ]. This approach provided a comprehensive understanding of how patients’ perspectives and behaviors evolved over time, offering insights into the complex interactions of medication management, understanding and adherence, and turning points within their medication journeys. A qualitative longitudinal design was used by Fylan et al. to underline patients’ resilience in medication management during and after discharge, by Brandberg et al. to show the dynamic process of self-management during the 4 weeks post-discharge and by Lawton et al. to examine how patients with type 2 diabetes perceived their care after discharge over a period of four years [ 49 , 50 , 51 ]. Our study focused on the first two months following hospitalization and future studies should focus on following discharged and at-risk patients over a longer period, as “transitions of care do not comprise linear trajectories of patients’ movements, with a starting and finishing point. Instead, they are endless loops of movements” [ 47 ].

Our results provide a particularly thorough description of how participants move from a state of total dependency during hospitalization regarding their medication management to a sudden and complete autonomy after hospital discharge impacting medication management, understanding, and adherence in the first days after discharge for some participants. Several qualitative studies have described the lack of shared decision-making and the loss of patient autonomy during hospitalization, which had an impact on self-management and created conflicts with healthcare professionals [ 75 , 81 , 84 ]. Our study also highlights nuanced patient experiences, including varying levels of patient needs, involvement, and proactivity during hospitalization and outpatient care, and our results contribute to capturing different perspectives that contrast with some literature that often portrays patients as more passive recipients of care [ 14 , 15 , 74 , 75 ]. Shared decision-making and proactive medication are key elements as they contribute to a smoother transition and better outcomes for patients post-discharge [ 85 , 86 , 87 ].

Consistent with the literature, the study identifies some challenges in medication initiation post-discharge [ 16 , 17 , 88 ] but our results also describe how daily routine rapidly takes over, either solidifying adherence behavior or generating barriers to medication adherence. Participants’ nonadherence prior to hospitalization was a factor influencing participants’ adherence post-hospitalization and this association should be further investigated, as literature showed that hospitalized patients have high scores of non-adherence [ 89 ]. Mortel et al. showed that more than 20% of discharged patients stopped their medications earlier than agreed with the physician and 25% adapted their medication intake [ 90 ]. Furthermore, patients who self-managed their medications had a lower perception of the necessity of their medication than patients who received help, which could negatively impact medication adherence [ 91 ]. Although participants in our study had high BMQ scores for necessity and lower scores for concerns, some participants expressed doubts about the need for their medications and a lack of motivation a few weeks after discharge. Targeted pharmacy interventions for newly prescribed medications have been shown to improve medication adherence, and hospital discharge is an opportune moment to implement this service [ 92 , 93 ].

Many medication changes were made during the transition of care (a median number of 7 changes during hospitalization and 7 changes during the two months after discharge), especially medication additions during hospitalization and interruptions after hospitalization. While medication changes during hospitalization are well described, the many changes following discharge are less discussed [ 7 , 94 ]. A Danish study showed that approximately 65% of changes made during hospitalization were accepted by primary healthcare professionals but only 43% of new medications initiated during hospitalization were continued after discharge [ 95 ]. The numerous changes after discharge may be caused by unnecessary intensification of medications during hospitalization, delayed discharge letters, lack of standardized procedures, miscommunication, patient self-management difficulties, or in response to an acute situation [ 96 , 97 , 98 ]. During the transition of care, in our study, both new and experienced participants were faced with difficulties in managing and understanding medication changes, either for newly prescribed medication or changes in previous medications. Such difficulties corroborate the findings of the literature [ 9 , 10 , 47 ] and our results showed that the lack of understanding during hospitalization led to participants having questions about their medications, even weeks after discharge. Particular attention should be given to patients’ understanding of medication changes jointly by physicians, nurses and pharmacists during the transition of care and in the months that follow as medications are likely to undergo as many changes as during hospitalization.

Implication for practice and future research

The patients’ perspectives in this study showed, at a system level, that there was a lack of standardization in healthcare professional practices regarding medication dispensing and follow-up. For now, in Switzerland, there are no official guidelines on medication prescription and dispensation during the transition of care although some international guidelines have been developed for outpatient healthcare professionals [ 3 , 99 , 100 , 101 , 102 ]. Here are some suggestions for improvement arising from our results. Patients should be included as partners and healthcare professionals should systematically assess (i) previous medication adherence, (ii) patients’ desired level of involvement and (iii) their needs for information during hospitalization. Hospital discharge processes should be routinely implemented to standardize hospital discharge preparation, medication prescribing, and dispensing. Discharge from the hospital should be planned with community pharmacies to ensure that all medications are available and, if necessary, doses of medications should be supplied by the hospital to bridge the gap. A partnership with outpatient healthcare professionals, such as general practitioners, community pharmacists, and homecare nurses, should be set up for effective asynchronous interprofessional collaboration to consolidate patients’ medication management, knowledge, and adherence, as well as to monitor signs of deterioration or adverse drug events.

Future research should consolidate our first attempt to develop a framework to better characterize medication at the transition of care, using Fig. 2   as a starting point. Contextualized interventions, co-designed by health professionals, patients and stakeholders, should be tested in a hybrid implementation study to test the implementation and effectiveness of the intervention for the health system [ 103 ].

Limitations

This study has some limitations. First, the transcripts were validated for accuracy by the interviewer but not by a third party, which could have increased the robustness of the transcription. Nevertheless, the interviewer followed all methodological recommendations for transcription. Second, patient inclusion took place during the COVID-19 pandemic, which may have had an impact on patient care and the availability of healthcare professionals. Third, we cannot guarantee the accuracy of some participants’ medication history before hospitalization, even though we contacted the participants’ main pharmacy, as participants could have gone to different pharmacies to obtain their medications. Fourth, our findings may not be generalizable to other populations and other healthcare systems because some issues may be specific to multimorbid patients with type 2 diabetes or to the Swiss healthcare setting. Nevertheless, issues encountered by our participants regarding their medications correlate with findings in the literature. Fifth, only 15 out of 21 participants took part in all the interviews, but most participants took part in at least three interviews and data saturation was reached. Lastly, by its qualitative and longitudinal design, it is possible that the discussion during interviews and participants’ reflections between interviews influenced participants’ management, knowledge, and adherence, even though this study was observational, and no advice or recommendations were given by the interviewer during interviews.

Discharged patients are willing to take steps to better manage, understand, and adhere to their medications, yet they are also faced with difficulties in the hospital and outpatient care. Furthermore, extensive changes in medications not only occur during hospitalization but also during the two months following hospital discharge, for which healthcare professionals should give particular attention. The different degrees of patients’ involvement, needs and resources should be carefully considered to enable them to better manage, understand and adhere to their medications. At a system level, patients’ experiences revealed a lack of standardization of medication practices during the transition of care. The healthcare system should provide the ecosystem needed for healthcare professionals responsible for or involved in the management of patients’ medications during the hospital stay, discharge, and outpatient care to standardize their practices while considering the patient as an active partner.

Data availability

The anonymized quantitative survey datasets and the qualitative codes are available in French from the corresponding author on reasonable request.

Abbreviations

adverse drug events

Adherence Visual Analogue Scale

Belief in Medication Questionnaire

Consolidated Criteria for Reporting Qualitative Research

case report form

standard deviation

World Health Organization

American Academy of Family Physician. Continuity of Care, Definition of 2020. Accessed 10 July 2022 https://www.aafp.org/about/policies/all/continuity-of-care-definition.html

Kripalani S, LeFevre F, Phillips CO, Williams MV, Basaviah P, Baker DW. Deficits in communication and information transfer between hospital-based and primary care physicians: implications for patient safety and continuity of care. JAMA. 2007;297(8):831–41.

Article   CAS   PubMed   Google Scholar  

World Health Organization (WHO). Medication Safety in Transitions of Care. 2019.

Forster AJ, Murff HJ, Peterson JF, Gandhi TK, Bates DW. The incidence and severity of adverse events affecting patients after discharge from the hospital. Ann Intern Med. 2003;138(3):161–7.

Article   PubMed   Google Scholar  

Krumholz HM. Post-hospital syndrome–an acquired, transient condition of generalized risk. N Engl J Med. 2013;368(2):100–2.

Article   CAS   PubMed   PubMed Central   Google Scholar  

Banholzer S, Dunkelmann L, Haschke M, Derungs A, Exadaktylos A, Krähenbühl S, et al. Retrospective analysis of adverse drug reactions leading to short-term emergency hospital readmission. Swiss Med Wkly. 2021;151:w20400.

Blozik E, Signorell A, Reich O. How does hospitalization affect continuity of drug therapy: an exploratory study. Ther Clin Risk Manag. 2016;12:1277–83.

Article   PubMed   PubMed Central   Google Scholar  

Allen J, Hutchinson AM, Brown R, Livingston PM. User experience and care for older people transitioning from hospital to home: patients’ and carers’ perspectives. Health Expect. 2018;21(2):518–27.

Daliri S, Bekker CL, Buurman BM, Scholte Op Reimer WJM, van den Bemt BJF, Karapinar-Çarkit F. Barriers and facilitators with medication use during the transition from hospital to home: a qualitative study among patients. BMC Health Serv Res. 2019;19(1):204.

Bekker CL, Mohsenian Naghani S, Natsch S, Wartenberg NS, van den Bemt BJF. Information needs and patient perceptions of the quality of medication information available in hospitals: a mixed method study. Int J Clin Pharm. 2020;42(6):1396–404.

Foulon V, Wuyts J, Desplenter F, Spinewine A, Lacour V, Paulus D, et al. Problems in continuity of medication management upon transition between primary and secondary care: patients’ and professionals’ experiences. Acta Clin Belgica: Int J Clin Lab Med. 2019;74(4):263–71.

Article   Google Scholar  

Micheli P, Kossovsky MP, Gerstel E, Louis-Simonet M, Sigaud P, Perneger TV, et al. Patients’ knowledge of drug treatments after hospitalisation: the key role of information. Swiss Med Wkly. 2007;137(43–44):614–20.

PubMed   Google Scholar  

Ziaeian B, Araujo KL, Van Ness PH, Horwitz LI. Medication reconciliation accuracy and patient understanding of intended medication changes on hospital discharge. J Gen Intern Med. 2012;27(11):1513–20.

Allen J, Hutchinson AM, Brown R, Livingston PM. User experience and care integration in Transitional Care for older people from hospital to home: a Meta-synthesis. Qual Health Res. 2016;27(1):24–36.

Mackridge AJ, Rodgers R, Lee D, Morecroft CW, Krska J. Cross-sectional survey of patients’ need for information and support with medicines after discharge from hospital. Int J Pharm Pract. 2018;26(5):433–41.

Mulhem E, Lick D, Varughese J, Barton E, Ripley T, Haveman J. Adherence to medications after hospital discharge in the elderly. Int J Family Med. 2013;2013:901845.

Fallis BA, Dhalla IA, Klemensberg J, Bell CM. Primary medication non-adherence after discharge from a general internal medicine service. PLoS ONE. 2013;8(5):e61735.

Zhou L, Rupa AP. Categorization and association analysis of risk factors for adverse drug events. Eur J Clin Pharmacol. 2018;74(4):389–404.

Moreau-Gruet F. La multimorbidité chez les personnes de 50 ans et plus. Résultats basés sur l’enqête SHARE (Survey of Health, Ageing and Retirement in Europe. Obsan Bulletin 4/2013. 2013(Neuchâtel: OBservatoire suisse de la santé).

Iglay K, Hannachi H, Joseph Howie P, Xu J, Li X, Engel SS, et al. Prevalence and co-prevalence of comorbidities among patients with type 2 diabetes mellitus. Curr Med Res Opin. 2016;32(7):1243–52.

Sibounheuang P, Olson PS, Kittiboonyakun P. Patients’ and healthcare providers’ perspectives on diabetes management: a systematic review of qualitative studies. Res Social Adm Pharm. 2020;16(7):854–74.

Müller-Wieland D, Merkel M, Hamann A, Siegel E, Ottillinger B, Woker R, et al. Survey to estimate the prevalence of type 2 diabetes mellitus in hospital patients in Germany by systematic HbA1c measurement upon admission. Int J Clin Pract. 2018;72(12):e13273.

Blanc AL, Fumeaux T, Stirnemann J, Dupuis Lozeron E, Ourhamoune A, Desmeules J, et al. Development of a predictive score for potentially avoidable hospital readmissions for general internal medicine patients. PLoS ONE. 2019;14(7):e0219348.

Hansen LO, Greenwald JL, Budnitz T, Howell E, Halasyamani L, Maynard G, et al. Project BOOST: effectiveness of a multihospital effort to reduce rehospitalization. J Hosp Med. 2013;8(8):421–7.

Khalid JM, Raluy-Callado M, Curtis BH, Boye KS, Maguire A, Reaney M. Rates and risk of hospitalisation among patients with type 2 diabetes: retrospective cohort study using the UK General Practice Research Database linked to English Hospital Episode statistics. Int J Clin Pract. 2014;68(1):40–8.

Lussier ME, Evans HJ, Wright EA, Gionfriddo MR. The impact of community pharmacist involvement on transitions of care: a systematic review and meta-analysis. J Am Pharm Assoc. 2020;60(1):153–.

van der Heijden A, de Bruijne MC, Nijpels G, Hugtenburg JG. Cost-effectiveness of a clinical medication review in vulnerable older patients at hospital discharge, a randomized controlled trial. Int J Clin Pharm. 2019;41(4):963–71.

Bingham J, Campbell P, Schussel K, Taylor AM, Boesen K, Harrington A, et al. The Discharge Companion Program: an interprofessional collaboration in Transitional Care Model Delivery. Pharm (Basel). 2019;7(2):68.

Google Scholar  

Farris KB, Carter BL, Xu Y, Dawson JD, Shelsky C, Weetman DB, et al. Effect of a care transition intervention by pharmacists: an RCT. BMC Health Serv Res. 2014;14:406.

Meslot C, Gauchet A, Hagger MS, Chatzisarantis N, Lehmann A, Allenet B. A Randomised Controlled Trial to test the effectiveness of planning strategies to improve Medication Adherence in patients with Cardiovascular Disease. Appl Psychol Health Well Being. 2017;9(1):106–29.

Garnier A, Rouiller N, Gachoud D, Nachar C, Voirol P, Griesser AC, et al. Effectiveness of a transition plan at discharge of patients hospitalized with heart failure: a before-and-after study. ESC Heart Fail. 2018;5(4):657–67.

Daliri S, Bekker CL, Buurman BM, Scholte Op Reimer WJM, van den Bemt BJF, Karapinar-Çarkit F. Medication management during transitions from hospital to home: a focus group study with hospital and primary healthcare providers in the Netherlands. Int J Clin Pharm. 2020.

Hansen LO, Young RS, Hinami K, Leung A, Williams MV. Interventions to reduce 30-day rehospitalization: a systematic review. Ann Intern Med. 2011;155(8):520–8.

Leppin AL, Gionfriddo MR, Kessler M, Brito JP, Mair FS, Gallacher K, et al. Preventing 30-day hospital readmissions: a systematic review and meta-analysis of randomized trials. JAMA Intern Med. 2014;174(7):1095–107.

Donzé J, John G, Genné D, Mancinetti M, Gouveia A, Méan M et al. Effects of a Multimodal Transitional Care Intervention in patients at high risk of readmission: the TARGET-READ Randomized Clinical Trial. JAMA Intern Med. 2023.

Rodrigues CR, Harrington AR, Murdock N, Holmes JT, Borzadek EZ, Calabro K, et al. Effect of pharmacy-supported transition-of-care interventions on 30-Day readmissions: a systematic review and Meta-analysis. Ann Pharmacother. 2017;51(10):866–89.

Lam MYY, Dodds LJ, Corlett SA. Engaging patients to access the community pharmacy medicine review service after discharge from hospital: a cross-sectional study in England. Int J Clin Pharm. 2019;41(4):1110–7.

Hossain LN, Fernandez-Llimos F, Luckett T, Moullin JC, Durks D, Franco-Trigo L, et al. Qualitative meta-synthesis of barriers and facilitators that influence the implementation of community pharmacy services: perspectives of patients, nurses and general medical practitioners. BMJ Open. 2017;7(9):e015471.

En-Nasery-de Heer S, Uitvlugt EB, Bet PM, van den Bemt BJF, Alai A, van den Bemt P et al. Implementation of a pharmacist-led transitional pharmaceutical care programme: process evaluation of medication actions to reduce hospital admissions through a collaboration between Community and Hospital pharmacists (MARCH). J Clin Pharm Ther. 2022.

Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.

De Geest S, Zúñiga F, Brunkert T, Deschodt M, Zullig LL, Wyss K, et al. Powering Swiss health care for the future: implementation science to bridge the valley of death. Swiss Med Wkly. 2020;150:w20323.

Noonan VK, Lyddiatt A, Ware P, Jaglal SB, Riopelle RJ, Bingham CO 3, et al. Montreal Accord on patient-reported outcomes (PROs) use series - paper 3: patient-reported outcomes can facilitate shared decision-making and guide self-management. J Clin Epidemiol. 2017;89:125–35.

Hesselink G, Schoonhoven L, Barach P, Spijker A, Gademan P, Kalkman C, et al. Improving patient handovers from hospital to primary care: a systematic review. Ann Intern Med. 2012;157(6):417–28.

(OFSP) Interprofessionnalité dans le domaine de la santé Soins ambulatoire. Accessed 4 January 2024. https://www.bag.admin.ch/bag/fr/home/strategie-und-politik/nationale-gesundheitspolitik/foerderprogramme-der-fachkraefteinitiative-plus/foerderprogramme-interprofessionalitaet.html

Mitchell SE, Laurens V, Weigel GM, Hirschman KB, Scott AM, Nguyen HQ, et al. Care transitions from patient and caregiver perspectives. Ann Fam Med. 2018;16(3):225–31.

Davoody N, Koch S, Krakau I, Hägglund M. Post-discharge stroke patients’ information needs as input to proposing patient-centred eHealth services. BMC Med Inf Decis Mak. 2016;16:66.

Ozavci G, Bucknall T, Woodward-Kron R, Hughes C, Jorm C, Joseph K, et al. A systematic review of older patients’ experiences and perceptions of communication about managing medication across transitions of care. Res Social Adm Pharm. 2021;17(2):273–91.

Fylan B, Armitage G, Naylor D, Blenkinsopp A. A qualitative study of patient involvement in medicines management after hospital discharge: an under-recognised source of systems resilience. BMJ Qual Saf. 2018;27(7):539–46.

Fylan B, Marques I, Ismail H, Breen L, Gardner P, Armitage G, et al. Gaps, traps, bridges and props: a mixed-methods study of resilience in the medicines management system for patients with heart failure at hospital discharge. BMJ Open. 2019;9(2):e023440.

Brandberg C, Ekstedt M, Flink M. Self-management challenges following hospital discharge for patients with multimorbidity: a longitudinal qualitative study of a motivational interviewing intervention. BMJ Open. 2021;11(7):e046896.

Lawton J, Rankin D, Peel E, Douglas M. Patients’ perceptions and experiences of transitions in diabetes care: a longitudinal qualitative study. Health Expect. 2009;12(2):138–48.

Mabire C, Bachnick S, Ausserhofer D, Simon M. Patient readiness for hospital discharge and its relationship to discharge preparation and structural factors: a cross-sectional study. Int J Nurs Stud. 2019;90:13–20.

Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50(3–4):462–80.

Meyer-Massetti C, Hofstetter V, Hedinger-Grogg B, Meier CR, Guglielmo BJ. Medication-related problems during transfer from hospital to home care: baseline data from Switzerland. Int J Clin Pharm. 2018;40(6):1614–20.

Neeman M, Dobrinas M, Maurer S, Tagan D, Sautebin A, Blanc AL, et al. Transition of care: a set of pharmaceutical interventions improves hospital discharge prescriptions from an internal medicine ward. Eur J Intern Med. 2017;38:30–7.

Geese F, Schmitt KU. Interprofessional Collaboration in Complex Patient Care Transition: a qualitative multi-perspective analysis. Healthc (Basel). 2023;11(3).

Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. Int J Nurs Stud. 2013;50(5):587–92.

Thomson R, Plumridge L, Holland J, Editorial. Int J Soc Res Methodol. 2003;6(3):185–7.

Audulv Å, Hall EOC, Kneck Å, Westergren T, Fegran L, Pedersen MK, et al. Qualitative longitudinal research in health research: a method study. BMC Med Res Methodol. 2022;22(1):255.

Kim H, Sefcik JS, Bradway C. Characteristics of qualitative descriptive studies: a systematic review. Res Nurs Health. 2017;40(1):23–42.

Sandelowski M. Whatever happened to qualitative description? Res Nurs Health. 2000;23(4):334–40.

Bradshaw C, Atkinson S, Doody O. Employing a qualitative description Approach in Health Care Research. Glob Qual Nurs Res. 2017;4:2333393617742282.

PubMed   PubMed Central   Google Scholar  

Bellone JM, Barner JC, Lopez DA. Postdischarge interventions by pharmacists and impact on hospital readmission rates. J Am Pharm Assoc (2003). 2012;52(3):358–62.

Hennink MM, Kaiser BN, Marconi VC. Code saturation versus meaning saturation: how many interviews are Enough? Qual Health Res. 2016;27(4):591–608.

World Health Organization. Adherence to long-term therapies: evidence for action. 2003.

Fisher JD, Fisher WA, Amico KR, Harman JJ. An information-motivation-behavioral skills model of adherence to antiretroviral therapy. Health Psychol. 2006;25(4):462–73.

Bandura A. Health promotion from the perspective of social cognitive theory. Psychol Health. 1998;13(4):623–49.

ShiftEUROQOL Research FOndation EQ 5D Instruments. Accessed 30 July 2022 https://euroqol.org/eq-5d-instruments/sample-demo/

Jeppesen KM, Coyle JD, Miser WF. Screening questions to predict limited health literacy: a cross-sectional study of patients with diabetes mellitus. Ann Fam Med. 2009;7(1):24–31.

Giordano TP, Guzman D, Clark R, Charlebois ED, Bangsberg DR. Measuring adherence to antiretroviral therapy in a diverse population using a visual analogue scale. HIV Clin Trials. 2004;5(2):74–9.

Horne R, Weinman J, Hankins M. The beliefs about medicines questionnaire: the development and evaluation of a new method for assessing the cognitive representation of medication. Psychol Health. 1999;14(1):1–24.

Horne R, Chapman SC, Parham R, Freemantle N, Forbes A, Cooper V. Understanding patients’ adherence-related beliefs about medicines prescribed for long-term conditions: a meta-analytic review of the necessity-concerns Framework. PLoS ONE. 2013;8(12):e80633.

Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Res Psychol. 2006;3(2):77–101.

Waibel S, Henao D, Aller M-B, Vargas I, Vázquez M-L. What do we know about patients’ perceptions of continuity of care? A meta-synthesis of qualitative studies. Int J Qual Health Care. 2011;24(1):39–48.

Rognan SE, Jørgensen MJ, Mathiesen L, Druedahl LC, Lie HB, Bengtsson K, et al. The way you talk, do I have a choice?’ Patient narratives of medication decision-making during hospitalization. Int J Qualitative Stud Health Well-being. 2023;18(1):2250084.

Michel B, Hemery M, Rybarczyk-Vigouret MC, Wehrle P, Beck M. Drug-dispensing problems community pharmacists face when patients are discharged from hospitals: a study about 537 prescriptions in Alsace. Int J Qual Health Care. 2016;28(6):779–84.

Bruhwiler LD, Hersberger KE, Lutters M. Hospital discharge: what are the problems, information needs and objectives of community pharmacists? A mixed method approach. Pharm Pract (Granada). 2017;15(3):1046.

Knight DA, Thompson D, Mathie E, Dickinson A. Seamless care? Just a list would have helped!’ Older people and their carer’s experiences of support with medication on discharge home from hospital. Health Expect. 2013;16(3):277–91.

Gualandi R, Masella C, Viglione D, Tartaglini D. Exploring the hospital patient journey: what does the patient experience? PLoS ONE. 2019;14(12):e0224899.

Norberg H, Håkansson Lindqvist M, Gustafsson M. Older individuals’ experiences of Medication Management and Care after Discharge from Hospital: an interview study. Patient Prefer Adherence. 2023;17:781–92.

Jones KC, Austad K, Silver S, Cordova-Ramos EG, Fantasia KL, Perez DC, et al. Patient perspectives of the hospital discharge process: a qualitative study. J Patient Exp. 2023;10:23743735231171564.

Hesselink G, Flink M, Olsson M, Barach P, Dudzik-Urbaniak E, Orrego C, et al. Are patients discharged with care? A qualitative study of perceptions and experiences of patients, family members and care providers. BMJ Qual Saf. 2012;21(Suppl 1):i39–49.

Murray SA, Kendall M, Carduff E, Worth A, Harris FM, Lloyd A, et al. Use of serial qualitative interviews to understand patients’ evolving experiences and needs. BMJ. 2009;339:b3702.

Berger ZD, Boss EF, Beach MC. Communication behaviors and patient autonomy in hospital care: a qualitative study. Patient Educ Couns. 2017;100(8):1473–81.

Davis RE, Jacklin R, Sevdalis N, Vincent CA. Patient involvement in patient safety: what factors influence patient participation and engagement? Health Expect. 2007;10(3):259–67.

Greene J, Hibbard JH. Why does patient activation matter? An examination of the relationships between patient activation and health-related outcomes. J Gen Intern Med. 2012;27(5):520–6.

Mitchell SE, Gardiner PM, Sadikova E, Martin JM, Jack BW, Hibbard JH, et al. Patient activation and 30-day post-discharge hospital utilization. J Gen Intern Med. 2014;29(2):349–55.

Weir DL, Motulsky A, Abrahamowicz M, Lee TC, Morgan S, Buckeridge DL, et al. Failure to follow medication changes made at hospital discharge is associated with adverse events in 30 days. Health Serv Res. 2020;55(4):512–23.

Kripalani S, Goggins K, Nwosu S, Schildcrout J, Mixon AS, McNaughton C, et al. Medication nonadherence before hospitalization for Acute Cardiac events. J Health Commun. 2015;20(Suppl 2):34–42.

Mortelmans L, De Baetselier E, Goossens E, Dilles T. What happens after Hospital Discharge? Deficiencies in Medication Management encountered by geriatric patients with polypharmacy. Int J Environ Res Public Health. 2021;18(13).

Mortelmans L, Goossens E, Dilles T. Beliefs about medication after hospital discharge in geriatric patients with polypharmacy. Geriatr Nurs. 2022;43:280–7.

Bandiera C, Ribaut J, Dima AL, Allemann SS, Molesworth K, Kalumiya K et al. Swiss Priority setting on implementing Medication Adherence interventions as Part of the European ENABLE COST action. Int J Public Health. 2022;67.

Elliott R, Boyd M, Nde S. at e. Supporting adherence for people starting a new medication for a long-term condition through community pharmacies: a pragmaticrandomised controlled trial of the New Medicine Service. 2015.

Grimmsmann T, Schwabe U, Himmel W. The influence of hospitalisation on drug prescription in primary care–a large-scale follow-up study. Eur J Clin Pharmacol. 2007;63(8):783–90.

Larsen MD, Rosholm JU, Hallas J. The influence of comprehensive geriatric assessment on drug therapy in elderly patients. Eur J Clin Pharmacol. 2014;70(2):233–9.

Viktil KK, Blix HS, Eek AK, Davies MN, Moger TA, Reikvam A. How are drug regimen changes during hospitalisation handled after discharge: a cohort study. BMJ Open. 2012;2(6):e001461.

Strehlau AG, Larsen MD, Søndergaard J, Almarsdóttir AB, Rosholm J-U. General practitioners’ continuation and acceptance of medication changes at sectorial transitions of geriatric patients - a qualitative interview study. BMC Fam Pract. 2018;19(1):168.

Anderson TS, Lee S, Jing B, Fung K, Ngo S, Silvestrini M, et al. Prevalence of diabetes medication intensifications in older adults discharged from US Veterans Health Administration Hospitals. JAMA Netw Open. 2020;3(3):e201511.

Royal Pharmaceutical Society. Keeping patients safewhen they transfer between care providers– getting the medicines right June 2012. Accessed 27 October 2023 https://www.rpharms.com/Portals/0/RPS%20document%20library/Open%20access/Publications/Keeping%20patients%20safe%20transfer%20of%20care%20report.pdf

International Pharmaceutical Federation (FIP). Medicines reconciliation: A toolkit for pharmacists. Accessed 23 September 2023 https://www.fip.org/file/4949

Californian Pharmacist Assiociation Transitions of Care Resource Guide. https://cdn.ymaws.com/www.cshp.org/resource/resmgr/Files/Practice-Policy/For_Pharmacists/transitions_of_care_final_10.pdf

Royal Collegue of Physicians. Medication safety at hospital discharge: Improvement guide and resource. Accessed 18 September 2023 https://www.rcplondon.ac.uk/file/33421/download

Douglas N, Campbell W, Hinckley J. Implementation science: buzzword or game changer. J Speech Lang Hear Res. 2015;58.

Download references

Acknowledgements

The authors would like to thank all the patients who took part in this study. We would also like to thank the Geneva University Hospitals Patients Partners + 3P platform as well as Mrs. Tourane Corbière and Mr. Joël Mermoud, patient partners, who reviewed interview guides for clarity and significance. We would like to thank Samuel Fabbi, Vitcoryavarman Koh, and Pierre Repiton for the transcriptions of the audio recordings.

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Open access funding provided by University of Geneva

Author information

Authors and affiliations.

School of Pharmaceutical Sciences, University of Geneva, Geneva, Switzerland

Léa Solh Dost & Marie P. Schneider

Institute of Pharmaceutical Sciences of Western Switzerland, University of Geneva, Geneva, Switzerland

Division of Endocrinology, Diabetes, Hypertension and Nutrition, Department of Medicine, Geneva University Hospitals, Geneva, Switzerland

Giacomo Gastaldi

You can also search for this author in PubMed   Google Scholar

Contributions

LS, GG, and MS conceptualized and designed the study. LS and GG screened and recruited participants. LS conducted the interviews. LS, GG, and MS performed data analysis and interpretation. LS drafted the manuscript and LS and MS worked on the different versions. MS and GG approved the final manuscript.

Corresponding authors

Correspondence to Léa Solh Dost or Marie P. Schneider .

Ethics declarations

Ethics approval and consent to participate.

Ethics approval was sought and granted by the Cantonal Research Ethics Commission, Geneva (CCER) (2020 − 01779), and informed consent to participate was obtained from all participants.

Consent for publication

Informed consent for publication was obtained from all participants.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary material 2, supplementary material 3, rights and permissions.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Solh Dost, L., Gastaldi, G. & Schneider, M. Patient medication management, understanding and adherence during the transition from hospital to outpatient care - a qualitative longitudinal study in polymorbid patients with type 2 diabetes. BMC Health Serv Res 24 , 620 (2024). https://doi.org/10.1186/s12913-024-10784-9

Download citation

Received : 28 June 2023

Accepted : 26 February 2024

Published : 13 May 2024

DOI : https://doi.org/10.1186/s12913-024-10784-9

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Continuity of care
  • Transition of care
  • Patient discharge
  • Medication management
  • Medication adherence
  • Qualitative research
  • Longitudinal studies
  • Patient-centered care
  • Interprofessional collaboration
  • Type 2 diabetes

BMC Health Services Research

ISSN: 1472-6963

what is data analysis procedures for qualitative research

IMAGES

  1. Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic

    what is data analysis procedures for qualitative research

  2. 5 Steps of the Data Analysis Process

    what is data analysis procedures for qualitative research

  3. The process of Qualitative Data Analysis includes analysing information

    what is data analysis procedures for qualitative research

  4. Data Analysis Techniques In Qualitative Research

    what is data analysis procedures for qualitative research

  5. Qualitative Data Analysis Methods And Techniques

    what is data analysis procedures for qualitative research

  6. Your Guide to Qualitative and Quantitative Data Analysis Methods

    what is data analysis procedures for qualitative research

VIDEO

  1. Analysis of Data? Some Examples to Explore

  2. Qualitative Research (Data Analysis and Interpretation) Video Lesson

  3. Qualitative Data Analysis Procedures in Linguistics

  4. Qualitative data analysis in NVivo

  5. Session 04: Data Analysis techniques in Qualitative Research

  6. Applied Research (quantitative data analysis)

COMMENTS

  1. Qualitative Data Analysis: Step-by-Step Guide (Manual vs ...

    Qualitative Data Analysis methods. Once all the data has been captured, there are a variety of analysis techniques available and the choice is determined by your specific research objectives and the kind of data you've gathered. Common qualitative data analysis methods include: Content Analysis. This is a popular approach to qualitative data ...

  2. Learning to Do Qualitative Data Analysis: A Starting Point

    For many researchers unfamiliar with qualitative research, determining how to conduct qualitative analyses is often quite challenging. Part of this challenge is due to the seemingly limitless approaches that a qualitative researcher might leverage, as well as simply learning to think like a qualitative researcher when analyzing data. From framework analysis (Ritchie & Spencer, 1994) to content ...

  3. Qualitative Data Analysis: What is it, Methods + Examples

    Qualitative data analysis is a systematic process of examining non-numerical data to extract meaning, patterns, and insights. In contrast to quantitative analysis, which focuses on numbers and statistical metrics, the qualitative study focuses on the qualitative aspects of data, such as text, images, audio, and videos.

  4. Qualitative Data Analysis Methods: Top 6 + Examples

    QDA Method #1: Qualitative Content Analysis. Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

  5. PDF The SAGE Handbook of Qualitative Data Analysis

    The SAGE Handbook of. Qualitative Data Analysis. Uwe Flick. 00-Flick-Prelims.indd 5 29-Oct-13 2:00:39 PM. Data analysis is the central step in qualitative research. Whatever the data are, it is their analysis that, in a decisive way, forms the outcomes of the research. Sometimes, data collection is limited to recording and docu- menting ...

  6. How to Analyze Qualitative Data?

    Qualitative data analysis is an important part of research and building greater understanding across fields for a number of reasons. First, cases for qualitative data analysis can be selected purposefully according to whether they typify certain characteristics or contextual locations. In other words, qualitative data permits deep immersion into a topic, phenomenon, or area of interest.

  7. What Is Qualitative Research?

    Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research. Qualitative research is the opposite of quantitative research, which involves collecting and ...

  8. Data Analysis for Qualitative Research: 6 Step Guide

    How to analyze qualitative data from an interview. To analyze qualitative data from an interview, follow the same 6 steps for quantitative data analysis: Perform the interviews. Transcribe the interviews onto paper. Decide whether to either code analytical data (open, axial, selective), analyze word frequencies, or both.

  9. Learning to Do Qualitative Data Analysis: A Starting Point

    In this article, we take up this open question as a point of departure and offer the-matic analysis, an analytic method commonly used to identify patterns across lan-guage-based data (Braun & Clarke, 2006), as a useful starting point for learning about the qualitative analysis process.

  10. PDF Qualitative Data Analysis

    grams for qualitative data analysis; you will see that these increasingly popular programs are blurring the distinctions between quantitative and qualitative approaches to textual analysis. 22 Features of Qualitative Data Analysis The distinctive features of qualitative data collection methods that you studied in Chapter 9 are also reflected

  11. PDF A Step-by-Step Guide to Qualitative Data Analysis

    Step 1: Organizing the Data. "Valid analysis is immensely aided by data displays that are focused enough to permit viewing of a full data set in one location and are systematically arranged to answer the research question at hand." (Huberman and Miles, 1994, p. 432) The best way to organize your data is to go back to your interview guide.

  12. Qualitative Data Analysis

    Qualitative data coding . Step 2: Identifying themes, patterns and relationships.Unlike quantitative methods, in qualitative data analysis there are no universally applicable techniques that can be applied to generate findings.Analytical and critical thinking skills of researcher plays significant role in data analysis in qualitative studies.

  13. Data Analysis in Qualitative Research: A Brief Guide to Using Nvivo

    In some cases, qualitative data can also include pictorial display, audio or video clips (e.g. audio and visual recordings of patients, radiology film, and surgery videos), or other multimedia materials. Data analysis is the part of qualitative research that most distinctively differentiates from quantitative research methods.

  14. How to use and assess qualitative research methods

    For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research ...

  15. Qualitative Research: Data Collection, Analysis, and Management

    Doing qualitative research is not easy and may require a complete rethink of how research is conducted, particularly for researchers who are more familiar with quantitative approaches. There are many ways of conducting qualitative research, and this paper has covered some of the practical issues regarding data collection, analysis, and management.

  16. Data Analysis in Research: Types & Methods

    Methods used for data analysis in qualitative research. There are several techniques to analyze the data in qualitative research, but here are some commonly used methods, Content Analysis: It is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze the documented ...

  17. (PDF) Qualitative Data Analysis and Interpretation: Systematic Search

    Qualitative data analysis is. concerned with transforming raw data by searching, evaluating, recogni sing, cod ing, mapping, exploring and describing patterns, trends, themes an d categories in ...

  18. Qualitative data analysis: a practical example

    The aim of this paper is to equip readers with an understanding of the principles of qualitative data analysis and offer a practical example of how analysis might be undertaken in an interview-based study. Qualitative research is a generic term that refers to a group of methods, and ways of collecting and analysing data that are interpretative or explanatory in nature and focus on meaning ...

  19. A Practical Iterative Framework for Qualitative Data Analysis

    16) undoubtedly presents the qualitative researcher with an unwanted realization. Nonetheless, there are many well-known texts outlining general procedures for conducting qualitative data analysis (e.g., Lincoln & Guba, 1985; Miles & Huberman, 1994; Patton, 2002; Strauss & Corbin, 1998). We do not intend to provide an in-depth discussion on ...

  20. (PDF) Data Analysis Methods for Qualitative Research: Managing the

    Thematic analysis is a method of data analysis in qualitative research that most researchers use, and it is flexible because it can be applied and utilized broadly across various epistemologies ...

  21. 5 Qualitative Data Analysis Methods to Reveal User Insights

    5 qualitative data analysis methods explained. Qualitative data analysis is the process of organizing, analyzing, and interpreting qualitative research data—non-numeric, conceptual information, and user feedback—to capture themes and patterns, answer research questions, and identify actions to improve your product or website.Step 1 in the research process (after planning) is qualitative ...

  22. What is data analysis? Methods, techniques, types & how-to

    Qualitative data analysis methods are defined as the observation of non-numerical data that is gathered and produced using methods of observation such as interviews, focus groups, questionnaires, and more. ... Analysis in qualitative research has by default additional subjective influences that must be controlled in a different way. Therefore ...

  23. Data Collection

    Data collection is a systematic process of gathering observations or measurements. Whether you are performing research for business, governmental or academic purposes, data collection allows you to gain first-hand knowledge and original insights into your research problem. While methods and aims may differ between fields, the overall process of ...

  24. Qualitative Data Analysis Methodologies and Methods

    Types of Qualitative Data Analysis Methodologies. Systematically analyzing textual, visual, or auditory content to identify patterns, themes, and meanings. Includes conventional, directed, and summative approaches. Identifying, analyzing, and reporting patterns or themes within qualitative data. Offers a systematic approach to coding and ...

  25. What is Qualitative Data Analysis Software (QDA Software)?

    Published: Oct. 23, 2023. Qualitative Data Analysis Software (QDA software) allows researchers to organize, analyze and visualize their data, finding the patterns in qualitative data or unstructured data: interviews, surveys, field notes, videos, audio files, images, journal articles interviews, web content etc.

  26. MCHRI Qualitative Research Workshop: Overview of Qualitative Data

    Description: The workshop will review qualitative research data collection methods, specifically reviewing qualities of interview and focus group methods. Presenters will discuss data collection best practices, tips for developing data collection materials such as interview guides, and how to prepare for qualitative analysis.

  27. What is Qualitative Data Analysis?

    Understanding Qualitative Data Analysis. Qualitative data analysis is the process of systematically examining and deciphering qualitative facts (such as textual content, pix, motion pictures, or observations) to discover patterns, themes, and meanings inside the statistics· Unlike quantitative statistics evaluation, which focuses on numerical measurements and statistical strategies ...

  28. Exploring the perspective of adolescent ...

    2.4 Data analysis. Principles of qualitative content analysis according to Kuckartz were applied using a deductive-inductive procedure that was data driven and iterative. 23 The qualitative content analysis software MAXQDA (2022) was used. Initial key categories were generated deductively based on the explorative research questions, the ...

  29. What are the strengths and limitations to utilising creative methods in

    There is increasing interest in using patient and public involvement (PPI) in research to improve the quality of healthcare. Ordinarily, traditional methods have been used such as interviews or focus groups. However, these methods tend to engage a similar demographic of people. Thus, creative methods are being developed to involve patients for whom traditional methods are inaccessible or non ...

  30. Patient medication management, understanding and adherence during the

    Study design. This qualitative longitudinal study, conducted from October 2020 to July 2021, used a qualitative descriptive methodology through four consecutive in-depth semi-structured interviews per participant at three, 10-, 30- and 60-days post-discharge, as illustrated in Fig. 1.Longitudinal qualitative research is characterized by qualitative data collection at different points in time ...