Introduction: We live in an era in which diversity is promoted but still lacks complete representation. The media we see today often walk a fine line between diversity and the mere display of it: while such media are glorified for their flair, they are more often than not abandoned by industry structures and major marketing, leaving the Latinx community treated as “niche.” The question we pose today is this: what will it take for the Latinx community to break through the surface and finally reach the audiences they have been working so hard for? In this interview, we met with Dr. Henry Puente to speak about his 2012 research study, “Marketing and Distribution Lessons from Hispanic Hollywood.”
By: Valentina Castillo, Samantha Hernandez, Jennifer Zavala
Dr. Waleed Rashidi, Department of Communications
Introduction
Our group decided to interview Dr. Waleed Rashidi, an associate professor in the Communications department. Dr. Rashidi’s primary research interest is music as a form of mass communication through media and technology experiences. Dr. Rashidi shared his journey through communication research: he earned his master’s degree at Cal State Fullerton and later received his doctorate from ULV. Before teaching, Rashidi was the editor-in-chief of Mean Street Magazine. His interest in music-focused research was sparked by his personal experience and passion for music and the industry, which led to a curiosity about music as a form of mass communication.
Q1: What is the main focus of your research?
Dr. Rashidi’s research explores the intersection of mass communication and music, with a special focus on how people engage with music in their daily lives. From vinyl records and cassette tapes to CDs and streaming platforms, he dives into the different ways people experience and collect music. His work also touches on music technology and the evolving world of music journalism, offering a broad look at how music continues to shape and reflect our culture.
Q2: How did you get started in research, and what made you focus on music more specifically?
Dr. Rashidi’s journey into research began during his time as a grad student at Cal State Fullerton, where he completed a master’s degree in communications nearly 20 years ago. Inspired by supportive professors and engaging coursework, he gradually found his footing in the world of academic research. Unsure at first where his passion lay, he found that connecting with professors and peers helped him explore different topics and eventually led him back to music. Before pursuing a master’s degree, however, he had already built a solid foundation in the music industry. After finishing his bachelor’s degree, his first professional role was as editor-in-chief of a music magazine based in Southern California, where he oversaw all editorial operations, from choosing which bands to feature to shaping the magazine’s overall voice. While in grad school, he was initially unsure whether music fit within the scope of communications research, but his growing involvement in the program helped him see the connection. Deciding to merge his industry experience with his academic work, he made music the center of his research, which now bridges mass communication and music, drawing from both professional insight and scholarly exploration.
Q3: What methods of research do you use most frequently? Which method engages the most participation/ results?
When it comes to research methods, his approach has evolved. Early on, he leaned heavily on in-depth interviews, an approach that came naturally given his background in music journalism, where interviewing artists daily was part of the job. This skill translated seamlessly into his academic work, allowing him to dive deep into people’s experiences with music and media.
More recently, he has expanded his methods to include online surveys and questionnaires, allowing him to reach broader audiences and gather different types of data. He has also delved into content analysis, especially of music journalism, examining patterns and themes across news stories and articles; this includes coding and rounds of review to identify recurring narratives. Ultimately, his choice of method depends on the nature of the research, blending qualitative and quantitative techniques to explore the many ways people interact with music and media.
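To picture what the coding stage of a content analysis might produce, here is a minimal Python sketch; the article IDs and theme labels are hypothetical and are not taken from Dr. Rashidi’s actual studies.

```python
from collections import Counter

# Hypothetical coding sheet: each article has been assigned one or more
# theme codes by a coder during a content analysis of music journalism.
coded_articles = {
    "article_01": ["nostalgia", "vinyl_revival"],
    "article_02": ["streaming_economics"],
    "article_03": ["nostalgia", "artist_interviews"],
    "article_04": ["vinyl_revival", "streaming_economics"],
}

# Tally how often each theme appears across the sample to surface
# recurring narratives.
theme_counts = Counter(code for codes in coded_articles.values() for code in codes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```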
Q4: What are some initial questions you ask yourself to decide if a topic is worth researching? What are some challenges that arise?
When it comes to choosing a research topic, one of the biggest challenges for Dr. Rashidi is finding something that truly feels unique, something that hasn’t already been thoroughly explored. The first question he always asks is, “Have I seen this before?” He’ll scan existing research to see if there’s a gap or a niche that hasn’t been filled, aiming to contribute something fresh to the field. If a topic feels overdone, or if someone else has already published something similar, he keeps searching for the idea that hasn’t been fully explored yet.
Of course, that process isn’t always easy. There are moments of frustration, times when a great idea turns out to already exist, or when he comes across research he wishes he had thought of first. But beyond the search for originality, there’s also a personal element: the topic has to be something he’s willing to live with for a long time. Research takes weeks, sometimes months, so it needs to be something he’s deeply interested in and connected to, something worth investing in from start to finish.
Time is another major challenge when it comes to research. Deadlines for journals and conferences can sneak up fast, and he often finds himself racing the clock to get everything done in time. It’s not just about finishing the work; it’s about making sure it’s good work, and that takes planning. He’s learned the key is to work backward from the deadline, mapping out milestone moments along the way, four months out, three months out, and so on. That way, he can pace himself and avoid a last-minute scramble. Still, even with careful planning, the time pressure can be intense. Balancing quality research with tight deadlines is a constant juggling act, and it’s one of the more stressful parts of the process.
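As a rough illustration of the backward-planning habit he describes, the short Python sketch below counts back from a hypothetical deadline; the milestone names and spacing are our own assumptions, not his actual schedule.

```python
from datetime import date, timedelta

# Hypothetical submission deadline; the checkpoints mirror the
# "work backward from the deadline" habit described above.
deadline = date(2025, 12, 1)

milestones = {
    "literature review drafted": deadline - timedelta(weeks=16),
    "data collection finished": deadline - timedelta(weeks=12),
    "analysis complete": deadline - timedelta(weeks=8),
    "full draft ready for revision": deadline - timedelta(weeks=4),
}

# Print milestones in chronological order.
for task, due in sorted(milestones.items(), key=lambda item: item[1]):
    print(f"{due.isoformat()}  {task}")
```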
Q5: Have you ever had to scrap or rethink a project entirely?
Scrapping or rethinking research projects is something he’s experienced, and more than once. Sometimes, an idea just doesn’t take off. Maybe there’s not enough existing information to build from, or he can’t find the right participants or enough of them to carry out the study effectively. That’s one of the more common challenges: having a strong concept, but realizing it’s too narrow or specific, making it tough to recruit people who fit the criteria.
There have also been times when the scope of a project was simply too ambitious or the timing didn’t work out, forcing him to scale back or go back to the drawing board entirely. These setbacks are part of the process, though. As frustrating as they can be, they often lead to a clearer, more realistic direction and sometimes even better ideas down the line.
Q6: How do you decide which sources or data to prioritize?
It depends very much on the individual project. There isn’t a “go-to” answer for what he will prioritize in a given study. He has to see how a project is laid out and how it will play out over the coming weeks and months, then determine where his priorities lie and where he will spend more or less of his time. It’s hard to say, “Well, this is what I always do,” because it doesn’t work that way; it varies from project to project.
Q7: When it comes to conducting research, do you typically collaborate with others or mostly work independently, and which do you prefer?
Dr. Rashidi enjoys working on his own, which sets him apart. Most of his colleagues like working in teams and collaborating with others, but he has always worked best as an independent researcher. The only time he teams up with other people is when grad students help him with his research, looking things up and gathering information for the literature review; that has been the extent of his collaboration. He prefers to be a solo author, an independent researcher.
He also finds it less challenging to work on his own. Everything he does is on his own timeline, and he doesn’t have to wait on others to complete their part before moving forward. Collaboration brings the added challenge of conflicting schedules and competing projects, which can create differences in priority: a shared project may be low priority for a collaborator while it’s high priority for Rashidi, and that can delay getting things done. When he works alone, at least he alone is responsible if something fails or isn’t finished on time.
Q8: What do you enjoy most about doing research, and do you believe you’ve made an impact?
Just the exploration of it all. He enjoys being able to explore something he finds interesting, unique, and fascinating that might also be of interest to other people. If he can add to the body of work already available on a topic, he can bring something different to the existing landscape, which benefits both himself and others.
He hopes that he has made an impact with the research he has done. Books have quoted his research, and his work has been published in books as well. Students have used him as a source or cited his work in their master’s theses and doctoral dissertations. He would like to think that what he is doing has benefited others or had an impact. It’s always hard to say, because he doesn’t always know where his research lands, but he hopes people read it, learn from it, and are inspired to do their own research from the ideas he presented.
Q9. What advice would you give someone interested in research?
The first thing Rashidi advises anyone interested in research is to have patience. Good research takes time, and you need patience to make sure you’re doing things properly. You also have to be methodical about how you operate: do things step by step, schedule them out, and be willing to make modifications as you go along, because sometimes what you originally set out to do isn’t exactly what you anticipated or expected, and that’s okay. You have to be able to take those tangents or go in a slightly different direction, and you might end up with better outcomes.
Q10: Most challenging part about doing research?
Sometimes it can be hard when you have a great idea, and then you start searching around and find out someone has already taken it. There are also times when he sees research out there and just thinks, “I wish that was me. I wish I thought of that idea!” Overall, it can be challenging to find a topic that is unique and different, but also something you feel other people will be interested in reading about, learning about, and exploring. Rashidi also believes you should feel connected to the research you do. You will be working on a topic for a fairly long time, so pick one you know you will feel engaged in and willing to invest time in.
Another big challenge can be the sheer amount of time that goes into research. Rashidi feels as though he is “racing the clock all the time” when it comes to the research he is putting together. Journals and conferences have deadlines for submitting papers, and he tries to finish in a timely manner while hoping he has enough time to put everything together.
Conclusion
Our interview with Dr. Rashidi left our team with a valuable and inspiring look into the world of academic research, particularly the connection between music and mass communication. Rashidi’s journey from music journalism to becoming a professor and researcher showed the importance of research and the challenges that come along with it. He gave us knowledge on how to conduct research and what steps to take to get effective results. Our team gained insight into the many methods of research and the importance of originality, along with the realities of time management and collaboration. He also emphasized the qualities we need to move forward, among them patience, organization, and passion, and the significance of researching topics that spark genuine interest. This interview left our team with a clear understanding of what it takes to conduct meaningful research and the qualities needed to thrive.
Dr. Elise Assaf is an assistant professor of communications, where she teaches courses in public relations and entertainment. She holds a PhD in Education from Chapman University and earned both her MA and BA in Communications from Cal State Fullerton.
In her dissertation, Hidden Power: Journalistic Representations of Mental Health Labels, Dr. Assaf examined how The New York Times, The Washington Post, and USA Today portray mental health. She used a qualitative case study approach with Critical Discourse Analysis to explore how journalistic language influences public understanding by reinforcing stereotypes and simplifying complex issues.
In our interview, she shared the personal and academic journey behind her research, along with insights on mental health, media framing, and the role of language in shaping perception.
Q: What initially sparked your interest in researching mental health representations in media, and how has that inspiration evolved throughout your research journey?
Dr. Assaf’s interest began with personal experience, especially the responses she witnessed to her brother’s schizophrenia diagnosis. She grew curious about how media shapes those reactions, noticing a pattern of repeated stereotypes and an emphasis on diagnosis rather than the full person. Rather than focusing on how audiences interpret these messages, she chose to examine how news outlets construct them.
Q: How did you approach balancing your personal perspectives with objectivity throughout the research process?
She believes that acknowledging personal bias is an important part of qualitative research. Rather than trying to remove it, she reflected on how her experiences shaped her analysis. She also collaborated with peers to compare interpretations and ensure the analysis remained grounded and open to other perspectives.
Q: What advice would you give to emerging researchers who want to explore media representation and discourse studies? What challenges should they anticipate?
Dr. Assaf advises choosing a topic you care deeply about. Research can be long and challenging, so genuine interest helps sustain the work. She emphasizes the value of contributing something meaningful to the field, whether by exploring new topics or revisiting older studies with updated perspectives. She encourages researchers to identify gaps in the literature and focus on producing work that informs and sparks conversation.
Q: How do you see your research contributing to broader discussions about mental health representation, especially within news media?
She views media as a powerful space for learning. Dr. Assaf sees her work as a way to connect communication and education by exploring how news stories shape what people know and believe about mental health. These stories do more than report facts. They influence how people talk about mental health and how it is understood in everyday life. Her goal is to help people become more aware of the messages they consume and more thoughtful about the assumptions those messages carry.
Q: You applied Critical Discourse Analysis to articles from The Washington Post, The New York Times, and USA Today. Would you expect different results if you applied your analysis to other major publications or non-traditional news sources?
She believes similar results would likely appear in other national newspapers because they serve similar audiences. However, different patterns might emerge in international outlets, local publications, or media with strong political leanings. She originally hoped to include more geographic diversity in her sample but revised her plan due to a conflict of interest.
Q: Were there any significant differences in how each of the three publications framed mental health? If so, what do you believe contributed to those differences?
Dr. Assaf found no major differences among the newspapers she studied. Each one used similar language and relied on comparable sources. Since they are all written for wide national audiences, their tone and framing were mostly aligned. While this consistency can be helpful for clarity, she noted that it can also limit the diversity of perspectives included in coverage.
Q: Your analysis identified patterns and trends in the linguistic choices used to describe mental health. Were there any specific words or phrases that stood out as particularly problematic or beneficial in shaping public perception?
She did not identify one specific phrase, but what stood out to her was the lack of voices from people with lived experience. Most of the reporting relied on third-party sources, such as officials or clinicians, which can affect credibility and create distance from the subject. She also noted that when stereotypes are repeated across multiple articles, they can quietly shape public attitudes in powerful ways.
Source: Washington Post
Q: What would you change or expand upon if you were to conduct this study again with a broader or different dataset? Would you consider including social media or alternative news sources?
Dr. Assaf would consider including social media and independent news platforms, though she recognizes the challenges involved. Social media content is less stable, often more difficult to verify, and subject to manipulation. She also expressed interest in studying local publications, but found that many smaller outlets are owned by larger companies and often recycle national stories.
Q: What limitations did you encounter in applying Critical Discourse Analysis to your dataset, and how would you address them in future research?
She explained that every method has its limits. Her analysis focused only on article text and did not include visuals, reader comments, or other digital elements. She also worked with a relatively small sample size in order to explore each article in depth. She believes future studies could expand on this by analyzing a broader range of content from more diverse sources.
Q: Considering your findings, what recommendations would you make to journalists or media outlets to improve their coverage of mental health topics?
She encourages journalists to seek out voices from people with lived experience. These stories are often overlooked or underrepresented, yet they provide important insight. She acknowledges that reaching these individuals can be difficult because of privacy and stigma, but believes the effort is worth it. Including more direct accounts can create more accurate, human-centered reporting and help shift the narrative away from overly clinical or institutional viewpoints.
Q: Were there any findings that particularly surprised you or contradicted your initial assumptions?
Yes. One thing that surprised her was the frequent appearance of first responders, such as police officers and firefighters, in mental health stories. In many cases, they were included even when they had no direct involvement. Dr. Assaf expected to see more quotes from family members or people with personal ties to those in the stories. This made her question why authority figures are so often treated as default experts on mental health, even though their experience in this area may be limited.
Conclusion
Dr. Assaf’s work reminds us that the way we talk about mental health matters. By examining how media frames these conversations, her research encourages us to become more thoughtful consumers of news and more intentional storytellers in our own lives.
Professor Cynthia King, Ph.D. Courtesy of the California State University, Fullerton Communications Department
The communication professor we interviewed was Dr. Cynthia King. She studied communications and earned her Ph.D. from the University of Alabama, studying under the renowned Dr. Zillman and Dr. Bryant. We held a Zoom meeting with her to gain insight into her experiment, “Effects of Humorous Heroes and Villains in Violent Action Films.” We chose this particular experiment because Dr. King examined media effects, which aligns with the goals of our own research project for Comm 410. In this interview, Dr. King explained how she designed the experiment, the different variables she considered when conducting her research, and how these effects shape the audience’s view of media.
Our conversation with Robert Meeds about a study he co-authored with one of his former master’s students at Kansas State dove into the psychological impact of internet advertisements and how consumers respond to them emotionally and behaviorally. Whether reading an article, watching a video, or browsing online, users are forced to physically stop what they are doing to manually close an ad, creating a sense of inconvenience that nobody likes. Although it’s a small action, it becomes a repeated annoyance that builds negative associations with both the ad itself and the brand it represents. Disruption is key to understanding whether something like a pop-up ad is effective or ineffective: pop-ups not only interrupt but also require effort from the user, making them feel imposed upon rather than engaged.

The study was primarily led by the master’s student and focused on how viewers react to pop-up ads, examining whether these interruptions influence attitudes toward the product or brand. The researchers built a theoretical framework exploring psychological reactions like frustration and avoidance. Interestingly, the results showed that most participants shared similar negative reactions, with frustration being the most common. This emotional response often overshadowed any positive brand messaging, reinforcing the idea that placement and execution heavily affect ad effectiveness. One of the most telling findings was that the act of manually closing a pop-up significantly contributed to users’ irritation. Despite the evolution of digital platforms, Meeds noted that research hasn’t shown advertising to be significantly more effective today than in the past. Instead, the field continues to focus on how specific persuasive techniques work with particular audiences and media types. This conversation highlighted how strategic execution and user psychology play central roles in ad effectiveness, and how essential it is for advertisers to consider the emotional responses their content might provoke.
Doing a deeper dive, there are a few key concepts that are crucial to the study. First, forced exposure is a major component. Forced exposure was used in the experiment’s ads to disrupt the flow of internet usage; the idea was to see whether this method of ad exposure would lead to negative memories of a brand. Unforced ads were also used as a point of comparison against forced ads in terms of which delivers branding best. The next key concept, which Robert Meeds touched on broadly, is psychological reactance: the reaction people have when something impedes their freedom. In this case, the study tried to find out whether the reaction to forced and unforced exposure to ads contributes to brand memory. The conclusion of the study affirmed the initial question, finding that forced exposure causes negative reactions and worse brand memory than non-forced ads. Other hypotheses examined whether the frequency of each exposure condition (forced and non-forced) produced negative results and whether individual differences in reactance would produce different responses. Neither was supported: attitudes toward the ads did not change with frequency, and the reactions of people high and low in reactance were too similar to be separated. To wrap up the summary, the type of ad presented to participants did cause negative reactions, as it disrupted the flow we settle into when browsing the internet. The findings are interesting when looking at the wider landscape of modern ads, as we consume more content at a higher rate than ever before. We asked Robert Meeds whether modern ads are considered intrusive and whether there are regulations around them. His answer surprised us: he mentioned that there is more of an unspoken rule about how ads should be made ethically, but there are no actual regulations on forced ads or how ads are presented to consumers.
What made this study especially strong was the way it looked at both emotional and memory-based responses. After participants viewed the ads, they were asked about their feelings, like whether they felt annoyed, frustrated, or indifferent. Then, they were tested on how much they remembered about the brand that was advertised. This two-part approach helped the researchers understand not just how people felt in the moment, but also how those feelings affected their ability to recall brand information later on. The researchers also looked closely at how different types of exposure—forced versus unforced—changed the way people interacted with the ads. Forced ads, like pop-ups and interstitials, appeared in a way that stopped users from continuing their activity. Unforced ads, like banners or sidebars, were easier to ignore and didn’t interrupt the user’s flow. These differences were important because they showed that how an ad is presented can be just as important as what the ad is actually saying. A surprising finding was that how often the ad appeared didn’t make a big difference. Whether people saw the ad a few times or many times, their reactions stayed about the same. This suggests that it’s the format and timing of the ad that really matter, not just how often someone sees it. It also challenges the idea that more exposure always leads to better results, especially when the delivery method disrupts the user’s attention or mood. In the bigger picture, this study helps us think more critically about how ads affect us every day. It reminds advertisers that people are not just passive viewers—they have emotions and preferences that should be respected. Ads that interrupt and annoy may stick in the memory, but not always for the right reasons. If the emotion connected to the brand is negative, the ad might actually do more harm than good in the long run. Overall, the study shows how advertising effectiveness depends on more than just getting attention. It’s about how that attention is earned. Respecting user experience, emotional reactions, and the natural flow of content is key to creating meaningful, lasting, and effective digital ads in today’s fast-moving online world.
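To make the study’s two-part measurement easier to picture, here is a minimal Python sketch that summarizes a handful of made-up participant records by exposure condition; the ratings, recall values, and field names are hypothetical and are not data from the Meeds study.

```python
from statistics import mean

# Hypothetical participant records loosely modeled on the design described
# above: exposure type (forced pop-up vs. unforced banner), an irritation
# rating (1-7), and whether the participant later recalled the brand.
responses = [
    {"exposure": "forced",   "irritation": 6, "recalled_brand": True},
    {"exposure": "forced",   "irritation": 5, "recalled_brand": True},
    {"exposure": "forced",   "irritation": 7, "recalled_brand": False},
    {"exposure": "unforced", "irritation": 2, "recalled_brand": False},
    {"exposure": "unforced", "irritation": 3, "recalled_brand": True},
    {"exposure": "unforced", "irritation": 2, "recalled_brand": False},
]

# Summarize emotional response and brand recall by condition.
for condition in ("forced", "unforced"):
    group = [r for r in responses if r["exposure"] == condition]
    avg_irritation = mean(r["irritation"] for r in group)
    recall_rate = sum(r["recalled_brand"] for r in group) / len(group)
    print(f"{condition:>8}: mean irritation {avg_irritation:.1f}, "
          f"brand recall {recall_rate:.0%}")
```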
(Written by Luciano Faria, Jade Estarada, Alexys Puche and Monet Andrade)
Dr. Dean Kazoleas is a seasoned communications scholar and the director of the Maxwell Center for International Communications at Cal State Fullerton. With over two decades of teaching and research experience, he specializes in public relations, crisis communication, and international campaigns. Dr. Kazoleas has developed global exchange programs, published extensively in the field, and consulted for over 50 organizations. He holds APR accreditation and has served in multiple leadership roles within the Public Relations Society of America.
Dr. Dean Kazoleas has authored over 20 articles in the public relations and communications field, pursuing research on why people lean toward certain types of evidence, whether qualitative or quantitative, and how that affects marketing decisions, polling decisions, academic decisions, and more. His 1993 research study, A Comparison of the Persuasive Effectiveness of Qualitative versus Quantitative Evidence: A Test of Explanatory Hypotheses, explores this idea: his vividness hypothesis predicted that qualitative evidence would be more persuasive, whereas his under-utilization hypothesis predicted that subjects’ knowledge and attitudes would be more swayed when they were exposed to quantitative evidence. This interview seeks to explore Dr. Kazoleas’s research methods, his reactions to the experiment, and his overall methodology, giving greater insight into research at large and strategies for being more successful in research.
(Excerpt from A Comparison of the Persuasive Effectiveness of Qualitative versus Quantitative Evidence: A Test of Explanatory Hypotheses – Dean Kazoleas)
Q1: How did you decide which research method to use?
In his research study, A Comparison of the Persuasive Effectiveness of Qualitative versus Quantitative Evidence: A Test of Explanatory Hypotheses, Dr. Dean Kazoleas asserted that the experimental method was the only appropriate approach for testing his hypotheses. Experimental research relies on a substantial amount of data to test hypotheses and draw meaningful conclusions. Dr. Kazoleas emphasized that “data is power,” stating that no matter how strong an opinion may be, it holds little weight unless supported by data that can validate it as fact.
Q2: Why do you believe that research method was best to use?
Dr. Kazoleas believes the experimental method was the best research method to use because it allows researchers to maximize the differences between independent variables while minimizing error. By using a controlled format, such as a video advertisement, the researcher can control the type and amount of information each participant receives. Essentially, this eliminates distractions or inconsistencies that could occur if participants were reading the material on their own, where they might skim or interpret the message differently. Controlling these factors ensures the data collected is more accurate and reliable for testing the hypotheses.
Q3: What obstacles did you face during your research, and how did you combat them?
Dr. Kazoleas begins by stating that he didn’t anticipate any obstacles during his research, not because he is a genius, but because he is well-versed in research and had made many mistakes before taking on this project. Having done many studies in the past, he emphasized how much he learned along the way. One of his biggest takeaways, and something he prides himself on, is learning not to repeat those past mistakes and not to make things harder than they need to be. He explains that research never needs to be overcomplicated; the simpler the study, the less likely a mistake will be made.
Q4: If you could redo the study, would you change anything in your approach?
Dr. Dean Kazoleas starts by mentioning how advanced technology is now compared to when he conducted his study. While he states that his research was straightforward and simple, if he had to choose one thing he would improve, it would be the video quality. With today’s technology, he acknowledges that researchers have access to more tools that can enhance the clarity and professionalism of their work.
Offering advice to researchers, Dr. Kazoleas emphasizes the importance of keeping studies simple and focused. He warns against overcomplicating the research process, as many scholars get overwhelmed by excessive data collection. His key takeaway: stay focused on answering the core research question rather than getting lost in an overload of information. Not only will it be too difficult to summarize, but it may also be difficult to organize in general. Scholars may feel the need to include every aspect of their research, but sometimes, it is just not needed.
(Excerpt from A Comparison of the Persuasive Effectiveness of Qualitative versus Quantitative Evidence: A Test of Explanatory Hypotheses – Dean Kazoleas)
Q5: How has this study influenced your perspective on the subject?
Dr. Kazoleas notes that statistics can be challenging for many people to understand. Simply presenting raw data isn’t always the most effective way to communicate research findings. However, modern tools allow researchers to transform statistics into infographics, making information more accessible and digestible for a wider audience.
Looking back, he acknowledges how time-consuming it was to create visual representations manually. Today, software can generate professional-looking infographics in just a few seconds—something researchers in the past could only dream of. He encourages scholars to take full advantage of these resources, as they not only enhance the presentation of findings but also make research more engaging.
Q6: About the research subjects, were they your students at the time (the 176 undergraduate students enrolled in communications classes at a large midwestern college)?
When looking at the experiment and its chosen sample of subjects, it wasn’t immediately clear whether Dr. Kazoleas had selected participants at random, whether they were chosen from within his own classes, or whether they came from the entire communications program. When asked about the participants involved in his study, Dr. Kazoleas confirmed that the 176 undergraduate students were not just his own communications students, but rather a representative group drawn from across the communications program at the large midwestern college. He emphasized that their participation was entirely voluntary and conducted in full compliance with ethical research standards, ensuring the integrity of the study.
Q7: Also, were you surprised by the findings of your study?
Reflecting on the results, Kazoleas admitted he was surprised by what the data revealed. While the research team had initial expectations, the findings offered unexpected insights into student attitudes and behaviors. According to him, the results not only challenged some of their early assumptions but also sparked new questions and directions for future exploration. Dr. Kazoleas mentioned that this study was part of a larger marketing effort to see what information and evidence persuade consumers to buy certain products, so this insight was key to getting researchers to a place where they ask the right questions in the right way to get the answers they are after.
“Information is power.” Dr. Kazoleas leads others to really challenge their ways of thinking and not go off of ifs, maybes, or tendencies, but to look at the statistics, the data, and the information that a study produces. With data at our fingertips, consumers and businesses alike can better understand their studies and their end goals, leading to growth and depth in how we understand each other as humans.
By: Jessica Marquez, Audrey Sanchez, Isaias Galvan, and Jacob Medina
Introduction: Dr. Ed Fink is a Faculty Emeritus and Professor of Cinema and Television Arts at California State University, Fullerton. With an impressive academic background, Dr. Fink holds a Ph.D. in Mass Communications and an M.A. in Telecommunications from Indiana University, as well as a B.A. in Theatre and Drama from Valparaiso University. Since joining Cal State Fullerton in 1990, Dr. Fink has played a huge role in the growth of the College of Communications, advancing to Full Professor in 2006 and later serving in multiple leadership roles, including Director of the Faculty Development Center, Associate Dean, and Dean of the College of Communications. One of his most notable achievements was authoring the proposals that created the B.A. degree and department of Radio-TV-Film in 2001, now known as Cinema and Television Arts. Dr. Fink is recognized for his dedication to teaching, advising, and scholarship, authoring four textbooks and publishing articles in respected journals. His expertise has been showcased at numerous professional conferences, and he continues to serve as a division officer for the Broadcast Education Association. Beyond academia, Dr. Fink is known for his successful fundraising efforts, commitment to service, and contributions to both the university and professional community.
Q1. What inspired you to conduct this research study?
When we spoke with him, Dr. Fink told us that this study was conducted for his doctoral dissertation back in the 1990s, when he was attending Indiana University. The study was originally 300 pages and had been cut down multiple times to reach the version we see today. As a graduate student, Fink was studying mass communication and taking many classes focused on research methods, where he had to conduct a lot of survey research and design. What inspired him was one of his professors, Egon Guba, who had written a book called Naturalistic Inquiry. After a study on teacher satisfaction, Guba realized that a couple of variables had been missed, which led him to question the whole science paradigm. His professor’s questioning of the paradigm sparked Fink’s interest in asking the same kinds of questions about mass communication. Unlike physics, chemistry, or civil engineering, mass communication doesn’t use just one method of research; it can use all of them, depending on the research question. With these questions about mass communication research in mind, Dr. Fink realized he wanted to make this his dissertation topic, hoping to find clarity on what researchers actually do in the field. From there, he conducted a literature review and found three traditions, social science, interpretive, and critical studies, under which all of the journal articles he read fell. Fink then went on to formally conduct his study, trying to prove his thesis that “all or at least most of the research of mass communication falls into these three broad, broadly defined research traditions.”
Q2. What were some of the challenges you faced while conducting this research/collecting the data?
Dr. Fink stated that there were quite a few challenges in conducting his research. The first was intercoder reliability: he had trouble defining all of his terms and did not quite know how to operationalize them once they were defined. He also ran into issues later with the wording he used in the study, which led those reviewing his dissertation to question the findings he collected. The second issue was the reliability of his coding. The graduate students he had hired as coders were not coming up with the same answers for the ten articles they were given, falling short of the 70% reliability standard. To fix the problem, Fink went back and fine-tuned his definitions and gave the graduate students ten additional articles to code; this second round got his reliability scores up to where they needed to be. The third problem was time: Fink was a new professor at Cal State Fullerton when doing this research, meaning he had to find time around his teaching duties to collect the data he needed, often using breaks like spring break to get most of the intercoder reliability work done.
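For readers unfamiliar with intercoder reliability, the sketch below shows the simplest version of the idea, percent agreement between two coders on ten hypothetical articles; the codes are invented, and Dr. Fink’s dissertation may well have used a more formal reliability statistic.

```python
# Hypothetical codes from two coders for ten articles, each assigned to one
# of the three traditions described above.
coder_a = ["social", "interpretive", "critical", "social", "social",
           "interpretive", "critical", "social", "interpretive", "social"]
coder_b = ["social", "interpretive", "social",   "social", "social",
           "interpretive", "critical", "social", "critical",     "social"]

# Percent agreement: the share of articles on which both coders agree.
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a)

print(f"Simple agreement: {percent_agreement:.0%}")  # 80% here, above the 70% bar
```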
Q3. If you could do this study again, what would you do differently?
If Dr. Fink had the opportunity to conduct his study again, he mentioned that there are several things he would do differently. One of the main changes he would implement is the use of mixed methods. He explained that mixed methods research has become increasingly popular among researchers today. This approach combines both qualitative and quantitative methods, allowing for a more comprehensive understanding of the research topic. Dr. Fink also discussed using a mixed method design that would include surveying participants, conducting focus groups, and incorporating experiments. He believes this approach would allow for a deeper exploration of the research question and would help capture different perspectives within the data. Dr. Fink also noted that researchers today often work across multiple traditions, leading to less coherence within a single method and more crossover between research approaches. If conducting the study today, he would create new coding instruments that account for this blending of methods, allowing for a more flexible and accurate analysis.
Lastly, Dr. Fink expressed the importance of being open-minded and considering the possibility that mixed methods might even be viewed as a fourth tradition in research. He explained that recent readings and studies he has explored highlight the growing use of mixed methods, suggesting that researchers should recognize this evolving trend and incorporate it into their work. Overall, Dr. Fink emphasized that mixed methods would play a key role if he were to conduct the study again.
Q4. How did you determine how long you would collect the data?
While working alongside his mentor, Dr. Fink aimed to determine how many studies he would need for his research to be considered credible. He was also trying to complete his work within a shorter time frame, ideally avoiding a two- to three-year process. After factoring in the time it typically took him to thoroughly analyze each article, as well as his financial situation, Dr. Fink ultimately decided to work with a sample of 253 studies. Out of those 253, eight were identified as outliers. Upon closer examination, Dr. Fink noticed that these eight studies posed research questions that didn’t align with the conditions of his research. The questions were either too vague or misaligned with the goals and methods of his study, making it difficult to draw meaningful conclusions. Many of the questions appeared to be rooted more in mass communication perspectives, but they lacked clear indicators that would classify them as falling under social science, interpretive, or critical approaches. As a result, Dr. Fink chose not to include these outliers in his final analysis. He did not view them as mixed-method studies either, as they didn’t meet the criteria or demonstrate a clear combination of methodologies.
Q5. What methods did you use to collect and analyze the data? (Qualitative, quantitative, or a combination of both for this study?)
Dr. Fink said that he used a quantitative method for this study. While defending his dissertation, it was brought to his attention that it might be problematic to use one of the three paradigms to study the paradigms themselves. He argued that the social science paradigm, grounded in numbers and data, was the most reliable and formal method for proving whether or not the proposed paradigms were the most commonly used in mass communication studies. Taking his research question into account, “Does it seem that the three research traditions guide the studies that are published in the field of mass communications?”, he found that a formal content analysis was the most appropriate method.
Q6. Did the study’s results come out how you expected, or were you surprised by your findings?
Dr. Fink explained that the results of his study came out as he expected. He mentioned that the process involved reading many journal articles to help define the three research paradigms and their operational variables. Out of 253 articles, Dr. Fink found that all but eight studies fit within these paradigms, which made sense to him. He explained that when a research paper is submitted, it goes through a peer-review process in which editors select reviewers who specialize in that specific area, and those reviewers tend to come from one of the three traditions identified in the study. For example, Dr. Fink noted that critical studies tend to make value judgments, historical articles often use interpretive methods, and mass media effects studies rely heavily on quantitative measurement and social science techniques. That being said, it did not surprise him that reviewers were often aligned with these paradigms.
However, Dr. Fink was surprised by some findings, particularly with the idea of generalizability within the social science paradigm. He expected to see more instances where researchers stated their study could be generalized to a larger population. He found that most studies limited their claims, likely because their sample sizes were small or specific, such as college students aged 18-25. Dr. Fink also anticipated more generalizability within critical studies, especially regarding calls for social change, but found this to be less common than expected.
For our interview, we had the privilege of speaking with Professor Christian Seiter, who teaches human communication at Cal State Fullerton, specifically about the process of social influence.
Professor Seiter studied people’s reactions on social media during the pandemic, and we interviewed him about how he used social media in his research, the results he found, and what he gained from doing this study. Professor Seiter based his research on three specific social media sites: Reddit, Facebook, and YouTube. The trajectory of people’s behavior led him to this study, especially once the pandemic began in March of 2020.
The pandemic was unlike anything that had happened in our lifetimes, and it brought out new sides of people’s behavior on social media as countrywide lockdowns, restaurants closing their dining rooms to serve only takeout, and toilet paper shortages occurred. The pivotal moment that led Professor Seiter to start this study was the way the public reacted to Tom Hanks and his wife, Rita Wilson, who were among the first celebrities to get COVID-19. Some people on social media responded with hate toward them, while others offered words of hope and encouragement. These two types of behavior caught Professor Seiter’s attention and became the starting point of his study. When interviewing Professor Seiter, we noticed how closely he paid attention to different behaviors on different apps based on their communities and guidelines. Some apps, such as Reddit, had a more vocal audience than others.
How do people use social media?
In his study, Seiter sought to examine how social media was being used in the wake of social issues affecting the world. During the research process, he said, “I would look up…you know…what people are talking about.” While actively searching through the social media sites, Professor Seiter found Reddit to be the most reliable source of information for collecting data for his research. Comparing the sites, he also found that people on Reddit were much nicer and more helpful than on the other two.

Social support among people on different social media sites
Professor Seiter states that he sought to answer whether people asking for help were actually receiving it on the different sites. The primary basis of the study was social support. The study also focuses on aggressive communication, which Professor Seiter sought to compare across the social media sites: “People look for social support and instead they get, like, ‘go to hell.’” People have different kinds of reactions on social media, and the communication among people on these sites varies; a person seeking social support may receive a positive or a negative reaction. Through his study, Professor Seiter aimed to use social media to see how people communicate in the wake of a major social dilemma. Although it was conducted a few years back, the study speaks to how social media is used at all times to talk about social issues online. There is potential to share facts, but also great potential for misinformation to spread, especially on issues that affect many people.

The SIDE model and anonymity: their role in social media
Professor Seiter connected the SIDE model and anonymity by explaining that, under the SIDE model, anonymity does not necessarily lead to antisocial behavior. By comparing and contrasting, he explained that while some people use anonymity to say negative things online, there are also those who use anonymity for acts of kindness. The primary focus of the SIDE model is the norms of the group in which anonymity occurs. For example, if we look at a group that is constantly giving back to the community and doing charitable things, the SIDE model predicts its members are most likely to use anonymity for good. On the other hand, those who usually show negative, antisocial behavior use anonymity to continue those behaviors and bring others down. Seiter even goes on to say that anonymity can intensify this: for a group of people normally doing nice things, or one constantly being negative, anonymity is a factor that truly increases that behavior. Professor Seiter emphasized that the SIDE model serves as the foundation of the study, while anonymity acts like a magnifying glass, helping researchers understand why people behave differently when given the option to remain anonymous.
A real-life example based on this study that Professor Seiter provided during the interview was the contrast between Facebook and Reddit during the pandemic. They found during the study that Facebook primarily uses real names and real personal photos, while Reddit has the aspect of anonymity; you can use made-up names and don’t need much personal information displayed, unlike Facebook. Reddit also has a system called Reddiquette, which is essentially a guideline on how to act on the platform. In Professor Seiter’s words, Reddiquette is “how not to be a jerk.” Reddit also has an upvote and downvote system; essentially, if a person makes a negative post or comment, users can downvote it, and the post/comment goes down the thread and doesn’t gain much foot traffic because of the negative votes. This system ensures that helpful and positive information is displayed and prioritized rather than negative. Because of Reddit’s anonymity and the systems in place to control antisocial behavior, many found Reddit to be more helpful than Facebook during the pandemic.
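As a simplified illustration of how net voting can push hostile posts down a thread, here is a short Python sketch; the post titles and vote counts are made up, and Reddit’s real ranking algorithm is more sophisticated than a simple net score.

```python
# Hypothetical posts with upvote and downvote counts; sorting by net score
# pushes heavily downvoted (often hostile) posts toward the bottom of a thread.
posts = [
    {"title": "Where can I find masks locally?",   "up": 120, "down": 4},
    {"title": "Stop panicking, you're all idiots", "up": 15,  "down": 230},
    {"title": "Grocery delivery tips for seniors", "up": 310, "down": 9},
]

ranked = sorted(posts, key=lambda p: p["up"] - p["down"], reverse=True)

for post in ranked:
    print(f"{post['up'] - post['down']:>5}  {post['title']}")
```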
Another example can be found on YouTube, where there isn’t much regulation on what is being said/posted compared to Reddit. Professor Seiter emphasized that the lack of regulations on YouTube causes those with antisocial behavior to flock toward that website and post what they can’t on sites like Reddit. Professor Seiter also stated, “The issue isn’t anonymity itself, the anonymity plus the anti-social norms, that’s where you have a problem.”
After the interview with Professor Seiter, the first question that came to mind was how different his study might be if it were conducted now. Since the pandemic, the whole world has become reliant on, and frankly addicted to, social media; for many, it is their only place of socializing, and because of this, communication via social media has intensified. People online have become even meaner because they have grown so comfortable behind a screen. In addition, at the time Professor Seiter conducted this study, TikTok was just beginning its rise. If TikTok were added to his study, would his findings come out differently?
Work Cited:
Seiter, C. R., & Brophy, N. S. (2021). Social Support and Aggressive Communication on Social Network Sites during the COVID-19 Pandemic. Health Communication, 37(10), 1–10. https://doi.org/10.1080/10410236.2021.1886399
By: Eli Lopez, Anthony Aguilar, Madison Arellanes, and Jackson Butler
Dr. Kelly D. Blake, ScD. Courtesy of the National Cancer Institute
Introduction
It was a great pleasure to conduct our scholar interview with Dr. Blake, a health scientist and program director in the National Cancer Institute’s Health Communication and Informatics Research Branch and the director of NCI’s Health Information National Trends Survey. Her research focuses on public support for different tobacco use policies, and speaking with her gave us further insight into her research process.
Q1: Can you explain why you decided to research the advertisements of tobacco products and their effects on tobacco use?
Dr. Blake mentioned that when she was in her doctoral program, her dissertation work was on public support for different tobacco control policies, and she spoke about how this study is a continuation of that interest. At the National Cancer Institute, Dr. Kelly Blake directs the Health Information National Trends Survey (HINTS), a communication-based survey. She felt that the survey she directs was a good place to begin looking at people’s opinions on reducing advertising and product placement at point-of-sale and on social media. A few years ago, the NCI published a monograph on tobacco and the media landscape, a synthesis of all the existing data that drew on the total weight of evidence from studies in the U.S. and abroad and demonstrated a causal relationship between tobacco advertising and promotion and increased tobacco use. She says that point-of-sale advertising is associated with impulse purchases of cigarettes, acting as a trigger that encourages smokers to buy more. Tobacco use is also prevalent on social media; most platforms have restrictions on tobacco advertising, but more can be done. She and her team were interested in documenting the public’s support for policies to limit this kind of advertising and placement, because public opinion can often drive a policy maker’s decision on whether or not to take on a policy.
Q2: How did you decide which research method to use? Why did you choose the next birthday method?
Dr. Blake feels that the best way to assess public opinion is to ask people what they support or oppose. HINTS is an established health communication survey, and they felt survey research was best because they could conduct it with a nationally representative sample. They chose the next birthday method because it allowed them to select a person at random within each household, giving every adult an equal probability of selection so that they could make inferences at the population level.
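To show how the next birthday method works in practice, here is a minimal Python sketch for a single hypothetical household; the names and dates are invented for illustration.

```python
from datetime import date

# Hypothetical household roster: each adult's next upcoming birthday relative
# to the contact date. The "next birthday" rule selects whichever member's
# birthday arrives soonest, giving each adult an equal chance of selection
# across many households.
contact_date = date(2025, 3, 15)
household = {
    "adult_1": date(2025, 7, 2),
    "adult_2": date(2025, 4, 9),
    "adult_3": date(2026, 1, 20),
}

selected = min(household, key=lambda name: (household[name] - contact_date).days)
print(f"Selected respondent: {selected}")  # adult_2, whose birthday comes next
```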
Q3: How did you determine how long you would collect the data?
Dr. Blake mentioned Don Dillman’s total design method for surveys, in which there is first a postcard or letter invitation, followed a few weeks later by a packet containing the actual survey, and then the process repeats. There is an initial contact followed by several follow-ups, all scattered over the course of seven or eight weeks. Dr. Blake said that they have modified that protocol a little bit: they use an established method of going out to households to try to get them to complete the survey, and then they wait until they see responses start to trickle in.
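The staggered-contact idea can be sketched in a few lines of Python; the wave labels and two-week spacing below are assumptions for illustration, not HINTS’s actual mailing schedule.

```python
from datetime import date, timedelta

# Hypothetical field start date; each contact wave is offset from it.
field_start = date(2025, 2, 3)

contact_waves = [
    ("invitation letter",         timedelta(weeks=0)),
    ("survey packet",             timedelta(weeks=2)),
    ("reminder postcard",         timedelta(weeks=4)),
    ("replacement survey packet", timedelta(weeks=6)),
    ("final reminder",            timedelta(weeks=8)),
]

for wave, offset in contact_waves:
    print(f"{(field_start + offset).isoformat()}  {wave}")
```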
Q4: How did you envision this research contributing to the understanding of product placement and advertisements in general? How do you expect it to contribute or affect the advertisements of tobacco?
Dr. Blake stated that in 2009, under the Obama administration, the Family Smoking Prevention and Tobacco Control Act gave the FDA the authority to regulate tobacco advertising, including at the retail level. It also preserved the rights of states and local and tribal entities to have their own tobacco retail regulations. In terms of product placement, Dr. Blake said that public health experts, including those at the National Cancer Institute, have supported prohibiting point-of-sale advertisements, such as at gas stations or other places where people are near product displays. There have long been policy proposals to limit that kind of paid advertising and placement, and those policies have proven successful in other countries, putting them far ahead of the United States.
Q5: When creating the survey questions, how did you conclude which questions you would ask?
Dr. Blake said that in survey research, everyone on your team has different measurement objectives, and HINTS makes sure that the questions are measuring the concepts to which they’re mapped. The first thing they did was think about the gaps, such as what is not yet known about policy support in the United States for these communication-related tobacco policies. The next thing they do is think about where they could be most helpful in informing policy and its limitations. They also reach out to internal and external experts on the given topic; these experts serve as “champions” for this sort of content and propose different survey questions. Next, they go through several months of testing in which they take 15-20 people (recruited by opt-in rather than at random) and pay each of them $75-$100 to go through the proposed questions, either in person or online, and explain what they think each question is trying to ask. They look at the results from that round and then repeat the process with another group of people.
Q6: We know that your research survey was conducted over the course of five months; how long did this research take to complete from start to finish?
Because this was a nationwide survey, the timeline was far longer than for smaller studies: from start to finish, Dr. Blake's research spanned roughly 18 months. That window covered everything from development testing, to drawing the sample of addresses from the postal service, to finalizing the data packages so the data would be ready for public release and available for others to use.
Q7: What were some challenges you and your team faced while conducting your research and how did you navigate the issue(s)?
Conducting and executing research surveys is no walk in the park. For Dr. Blake, one roadblock was deciding how to handle respondents who answered the policy-support questions neutrally, neither supporting nor opposing but sitting in the middle.
In their study, they grouped neutral responses in with opposition, not because neutrality necessarily reflects an opposing opinion, but because this group represents people whose opinions can still be moved. Framing the analysis that way let the researchers plan how to shift neutral respondents toward the supporters, which matters when the goal of a policy effort is to move people in one direction.
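To make that analytic choice concrete, here is a minimal Python sketch. The response labels and data are hypothetical rather than the actual HINTS item; it simply shows what grouping the neutral category with the non-support side looks like in practice.

```python
import pandas as pd

# Minimal sketch with a hypothetical 5-point item (not the actual HINTS
# variable): collapse the Likert-style responses into a binary outcome,
# grouping "Neither support nor oppose" with the non-support side.
responses = pd.Series([
    "Strongly support", "Somewhat support", "Neither support nor oppose",
    "Somewhat oppose", "Strongly oppose", "Somewhat support",
])

SUPPORT = {"Strongly support", "Somewhat support"}
binary = responses.map(lambda answer: "Support" if answer in SUPPORT else "Neutral/Oppose")

print(binary.value_counts(normalize=True).round(2))
# Support           0.5
# Neutral/Oppose    0.5
```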
Conclusion
Concluding the interview, Dr. Blake left our team with a fresh perspective on how to move forward when conducting research. We understand that research takes time and can span many months, depending on its scale. Ultimately, we are left with the understanding that in order to have a truly random sample, you must allow for an equal probability of selection when surveying. Dr. Blake and her team's research concluded that the majority of adults in the U.S. support prohibiting tobacco product placement on social media, and their analysis showed how that support varied by age, sex, education, rurality, and children in the household. Dr. Blake showed us how much work and thought goes into conducting research and fielding surveys, which gives us a clearer vision for our own term project.
Photos courtesy of Blake et al. (2021), "U.S. Public Opinion Toward Policy Restrictions to Limit Tobacco Product Placement and Advertising at Point-of-Sale and on Social Media."
Our group chose to interview Bayla Gomberg, a communications scholar with a master’s degree from California State University, Fullerton. Bayla shared her journey through communications research, which began with her undergraduate studies as an advertising major. Initially drawn to social media topics, she developed a strong interest in understanding body image, the influence of celebrity culture, and the effects of parasocial relationships in online spaces. Her early curiosity about how social media affects self-perception and identity sparked a desire to conduct research on these themes.
During our discussion, Bayla explained how she approached developing a focus for her projects, describing the challenges and rewards of homing in on specific areas within communications. For Bayla, finding her research path meant paying close attention to the questions that intrigued her. She connected each topic with a research question, then linked that question to a specific behavior, allowing her to dive deeper into the nuances of social media's impact. By following what interested her most, she built a foundation for exploring new ideas, seeking to understand the complex interactions between media and individuals. This method, she emphasized, was crucial in guiding her studies and refining her focus.
Our conversation offered a valuable look at Bayla’s commitment to exploring real-world questions through academic research, and it underscored the importance of personal passion in selecting and sustaining a research focus. Her journey highlights how pursuing specific questions can illuminate larger trends in communications.
During our conversation, Professor Gomberg highlighted her experience with research, particularly content analysis. She elaborated on collecting images for content analysis and creating themes and ideas based on the findings. In her specific area of study, she mentioned research on body image and the brand Skims: through the content analysis she conducted, Professor Gomberg examined different brands that sell shapewear and the kinds of images that could persuade people. In explaining this process, she also emphasized decoding and how important it is when building analyses from the images and content collected. Bayla also recommended using theories that you understand and know how to apply rather than ones you are less familiar with.
When asked about challenges she faced while collecting data, she described a few different scenarios. When collecting data at the university level, Professor Gomberg said getting a wide age range in the population was difficult: most students working on a group project are 18-24, and the people taking the survey tend to be the same age, because they are either peers or people you interact with on social media. When it comes to content analysis, Professor Gomberg explained how keeping an "objective opinion" became an issue, because when looking at content and different images it is difficult not to view them through your own opinions; she advises being critical of the images you are analyzing. Finally, she noted that conducting focus groups at the college level can be expensive and time consuming, and getting everyone to agree on times and actually meet can be a hassle. Bayla also elaborated on how focus group members' opinions can affect one another's responses; if participants feel judged, embarrassed, or unsafe, their answers may not be as truthful. She addresses this by stressing the importance of being organized with your questions and creating a safe environment where participants can freely and truthfully provide the data needed.
Professor Gomberg provided much-needed insight into the importance of content analysis and how to navigate the difficulties of collecting data and recruiting volunteers to participate.
Professor Bayla Gomberg gave us valuable insight into how to conduct a research project and the benefits and challenges that come with it. For instance, she says that selecting participants for surveys or focus groups really depends on your research question. If your research question focuses on social media, you would find your participants on social media. If you are looking for someone who has seen a particular movie, there is a chance they are not on social media, so you would have to extend your search to a physical place such as Walmart or, better yet, a forum or website that does movie reviews. You would then have to find a specific target audience for your research, such as a certain age demographic, and hope to collect "enough people in that age range." Professor Gomberg also advised us not to limit survey recruitment to a single website and instead to post on several different sites, because you want a good population and sample size, whether nationwide or from around the world. The most important lesson she has learned about participant recruitment, she told us, is to be organized and to "be on the same page as your other researchers," and to keep trying to reach your target audience and understand them on their level rather than just being "the researcher on the other side of the screen." In the end, you are trying to get the most authentic responses from your research participants.
For data organization and analysis tools, she recommends using Qualtrics through the Cal State Fullerton website. You can easily type out the questions you want and set up the answer choices, design the survey to look professional, and, once it is posted, watch the responses come in live to make sure people are "learning and doing what you need them to do." She mentions that with quantitative data, Qualtrics reports the results as percentages and does all the math for you, so you are not spending time adding up responses by hand. Finally, you can convert those results into something more visually pleasing and easier to understand, such as pie charts and graphs. For presenting the results, Professor Gomberg used Google Slides, copying the results onto the slides and explaining what they mean.
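To show roughly what that automatic tallying amounts to, here is a minimal Python sketch with made-up responses (our own illustration, not Qualtrics itself): count each answer choice, convert the counts to percentages, and draw a simple pie chart.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Minimal sketch of the tallying a survey tool does automatically, using
# made-up responses to a single hypothetical question.
answers = pd.Series([
    "Yes", "No", "Yes", "Yes", "Unsure", "No", "Yes", "Yes", "Unsure", "Yes",
])

# Count each answer choice and convert the shares to percentages
percentages = answers.value_counts(normalize=True).mul(100).round(1)
print(percentages)  # Yes 60.0, No 20.0, Unsure 20.0

# Render the same breakdown as a pie chart
percentages.plot.pie(autopct="%1.1f%%")
plt.ylabel("")
plt.title("Hypothetical survey item results")
plt.show()
```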
Additionally, to ensure that your findings are reliable and reflect accurate insights, Professor Gomberg recommends looking at the data, the research question, and how the majority of people responded. However, she emphasizes also looking at the responses that ran opposite to what you expected or that make up a very low percentage, because, in her own words, "just because something is a majority, does not mean that is everybody and that is not the most important part of the research." You have to look at the minority results and ask yourself: why did so many people not see it this way, or not behave this way? Since it really depends on what you are researching, it is important to look at both sides and to "dive deep" into participants' answers rather than just looking at the numbers. The true explanation of your percentages is what brings your research to another level, in contrast with just saying, "here are the results!"
In sum, Professor Gomberg's answers on conducting your own research were really helpful and insightful for our future research projects within the communications major.
One of the main takeaways from our discussion with Professor Bayla Gomberg was that the most important part of doing research is making sure you are organized and have a plan before you even start collecting data. She especially emphasized the importance of organization in group projects, because it is easy to get lost when working in a group setting. By making sure everyone is on the same page, the way you work together, conduct research, and write your essay will be far more cohesive.
One statement that Professor Gomberg made that really stood out to me was when she said, “The theory drives the research and really provides support to the entire argument you’re trying to make. Without the theory, the argument wouldn’t really be an argument.” This highlights the importance of understanding the theory you are using and explaining your results as if you were explaining them to a two-year-old, making them clear and concise. She also mentioned that it’s okay to keep it simple when conducting a study; for this reason, it’s fine to choose one specific aspect to deeply explore in your research. Focusing on one variable can help break things down, and it’s okay not to address every single question you posed in your research unless specifically recommended by a professor.
Overall, she advised us to research different topics and challenge ourselves to collect data in various ways, such as through focus groups and exploring different subject areas.