Chapters
1 Justice-Centered Science: Understanding How Science Can and Has Impacted Communities
1.1. Learning Outcomes
This chapter focuses on understanding different perspectives on science and the role science has played in society. The major learning outcomes include the following:
Students will be able to
- Identify the key components of Western science practices
- Explain how science practices have resulted in harm historically and contemporaneously
- Define justice-centered science
- Articulate a code of research ethics that incorporates an understanding of justice-centered science
1.2. What Is Science?
At its broadest, we understand science as an approach to understanding and interpreting natural phenomena through processes such as empirical observation, and it is an endeavor pursued in all human cultures. It is important to ground this discussion of justice-centered science in an awareness that there are other ways of pursuing and transmitting knowledge and understanding of the natural world. However, for the purposes of this text, we are specifically discussing, and distinguishing, Western science as a construct because it is the dominant influence on how science is practiced, implemented, funded, and discussed in cultures that are European or were previously colonized or controlled by European countries. Further, framing this conversation within the Western tradition of science provides a better foundation for understanding justice-centered science as an approach to addressing inclusion and equity issues within Western science. Western science emphasizes concrete frameworks and a search for objective truth built on Eurocentric knowledge traditions [Mazzocchi, 2006]. According to Mazzocchi [2006, p. 464], “Western science is objective and quantitative…and is based on academic and literate transmission.” Therefore, when we use the term “science” in this text, we are specifically referring to the Western tradition of science.
Although different approaches and methodologies can be used to practice science, it is a process of discovery, constantly evolving in the face of new evidence toward a better or richer understanding of the world. Scientists are taught not to take things at face value but instead to question assumptions and seek evidence before accepting claims made by others or confidently asserting their own results. The scientific method—making observations, forming hypotheses, running experiments, and analyzing data—historically has been used as a way of explaining natural phenomena. This is critical in that the scientific method is intended to recognize and minimize bias; critical thinking and skepticism are foundational principles of Western science. Yet even though the scientific method seeks to make science objective, science remains a human endeavor framed through a particular sociocultural lens. All research carries some bias because it is done by human beings or by systems created by human beings. This bias comes into play in the selection of which questions are worth researching, in decisions about what to include or exclude in a study, and in the ways that expectations and implicit biases can influence interpretation. Science as practiced can also be overly rigid and reliant on quantifiable data, which can miss the complexities and nuances of social phenomena and can lead to unintentional harm and narrow understanding. Embracing the subjective nature of knowledge and acknowledging culture and human values in the research process should also play a role [Guba and Lincoln, 1994]. Thus, although the scientific method has value, it may not be universally applicable, particularly in fields investigating social phenomena.
Science has brought dramatic changes to societies by enabling great technological developments, ensuring better living conditions, and providing new ways of understanding the world. For example, antibiotics have saved millions of lives, extended life spans, and improved living conditions. Although societies have always had medical and health practices to respond to and treat medical issues, science revolutionized those practices with antibiotics, and the study of antibiotics has continued to advance health and health care. However, science has its own problems. Dilemmas of a moral or ethical nature may arise in how science is practiced, what is researched, and how scientific discoveries are used. Scientific discoveries can be misused and have been throughout history. The eugenics movement, for example, was grounded in the application of genetic science to the supposed improvement of humanity. The belief that some humans were genetically superior was grounded in what was considered to be rigorous science during the 19th and early 20th centuries. At its extreme, eugenic thinking helped justify strict immigration laws and, in Nazi Germany during World War II, genocide [Kitcher, 2001]. The pursuit of science is not just about the discovery of new knowledge. It should also include the application of this knowledge for the betterment of humanity—for all people on the planet.
1.3. How Has Science Harmed Communities?
1.3.1. Ways Science Has Harmed Communities
Although science has benefited society in many ways, it also has caused harm to numerous vulnerable communities throughout history. Examples of harm science has caused are evident throughout the sciences. One recent and notorious example, which even received a formal presidential apology from Bill Clinton in the 1990s, is the “Tuskegee Study of Untreated Syphilis in the Negro Male” by the U.S. Public Health Service. The study was carried out in Tuskegee, Alabama, over 40 years, starting in the early 1930s, and involved hundreds of Black men. The majority (67% of participants) were infected with syphilis, a sexually transmitted disease that, left untreated, can lead to significant and even fatal health conditions. Participants thought they were being treated for “bad blood,” a term used to describe several possible diseases, including syphilis. Informed consent was not collected from the participants. Rather, in exchange for being in the study, the men received benefits like meals, medical exams, and burial insurance. Despite penicillin becoming a widely available cure for syphilis in the 1940s, it was not offered to the men. The researchers made no effort to provide the men in the study with available medical treatment for a life-altering, fatal disease, and they did not release their research findings until the disease had taken its course, thus dooming these men to pain and, in some cases, death [Jones, 1993].
Another example, also rooted in medical research, involves Henrietta Lacks, an African American mother of five. In 1951, Lacks was being treated for cervical cancer at Johns Hopkins Hospital. During her treatment, cells were taken from her without her knowledge or permission and experimented on by Dr. George Gey, a researcher who had been collecting cells from all patients with this type of cancer. Gey’s previously collected samples died quickly in the lab, but Lacks’s cells did not. Rather than dying, Lacks’s cells doubled in number every day. Known as HeLa (pronounced hee-lah, as taken from the first two letters of her first and last names), these cells became one of the most important tools in medical research because they were found to be durable and prolific, meaning researchers could study the effects of various treatment options on the growth of cancer cells without experimenting directly on humans. Johns Hopkins never sold or profited from the discovery or distribution of these HeLa cells but did make the cells freely and widely available for scientific research. Although her family has since been compensated and Lacks’s contribution to science has been recognized, her case underscores the exploitation and lack of informed consent often experienced by marginalized communities in scientific research [Skloot, 2010].
Discussion Questions
- Johns Hopkins states that what they did with HeLa cells was legal, but was it ethical? What harm was caused by using Lacks’s cells for research without her knowledge or permission? Who was harmed and how?
- Although Hopkins and other institutions did not directly profit from using HeLa cells, did they profit indirectly?
The development of oral contraception highlights another example of “scientific progress” achieved by exploiting vulnerable communities, in this case women from a colonial territory, for the “betterment” of humanity. In the 1950s, the first birth control pill, Enovid, was developed in the United States through mass-scale, unethical clinical trials on Puerto Rican women. These trials, run by Gregory Pincus and partially financed by Margaret Sanger and Katharine McCormick, were carried out without informed consent; the researchers gave the subjects no information about the risks. The trials had dangerous side effects and caused the deaths of three women, with no consequences for the researchers.
Discussion Questions
- Why do you think researchers did not feel like they had to inform participants of the potential side effects of the medications used in the trial?
- Why do you think the researchers faced no consequences?
https://www.history.com/news/birth-control-pill-history-puerto-rico-enovid
Although it may initially seem that the primary avenue of harm from science is in the medical field, and arguably medical research has worked to eliminate these types of harms, the physical sciences have also harmed communities. For example, science has played a role in environmental injustices by subjecting low-income and minority communities to harm. Leaded gasoline and lead-based paints were introduced in the 1920s. Today, we know the detrimental health effects of lead on humans, but it took 50 years to identify those negative health effects, and the economic and social complexity of recognizing, removing, and cleaning up those harms resulted in a disproportionate negative impact on low-income and minority communities. Eventually, regulations were put in place to end the use of leaded gasoline and lead-based paints, but the damage had already been done. Sustained use of leaded gasoline in cars and lead paints in homes and schools led to significant public health impacts—mostly in urban areas with predominantly underrepresented communities. Children in these communities who were exposed to lead experienced cognitive impairments and additional severe health problems. However, public health officials and scientists failed to identify or properly address the risks until after many thousands of children throughout the United States had suffered lead poisoning over a span of several decades [Needleman, 1993].
As another example, in the 1980s, Robert D. Bullard, the father of environmental justice, researched waste disposal facilities in Houston. The study was the first comprehensive account of eco-racism in the United States, finding that polluting facilities and privately owned landfills were primarily sited in Black neighborhoods, even though Black people made up only a quarter of the city’s population. This discovery prompted Bullard to begin a lifelong academic and activist campaign against environmental racism.
Harm is not only caused by ethical and moral lapses in the practice of science. Harm is also caused when science is not practiced inclusively and when scientists don’t challenge their own inherent beliefs, even when the science is practiced in what people believe is a rigorous, objective way. Science is shaped by the questions asked, the choices of what to study, and the perspective of the researchers. The historical lack of diversity in scientific research teams, as well as in human subject research, has also contributed to the harms science has created. For instance, lack of diversity in human subject research has led to medicines, treatments, and even biometric data that serve only one population. Modern biometrics, for example, still have trouble recognizing faces with darker skin tones: Although errors in detection for lighter-skinned people are less than 1%, they soar to more than 30% for those who are darker skinned. Pulse oximeters, devices worn on the finger and used to measure oxygen saturation in the blood, were primarily tested and calibrated using only individuals with lighter skin tones. This has led to inaccurate readings (overestimates of oxygen saturation) in people with darker skin, which could be potentially fatal in critical care situations, as during the COVID-19 pandemic. Even research designed to engage communities through citizen data collection has been found to exclude communities with lower educational attainment and poorer economic conditions, resulting in data skewed toward affluent, educated communities. White, affluent communities have been more likely to participate in these types of citizen science activities, in part because, systemically, communities of color are more broadly impacted by economic and educational marginalization and therefore face more barriers to participation. The result is that data collected through citizen science may have a sample bias that leaves out data from marginalized communities and their environments [Mahmoudi et al., 2022].
For more discussion, the case studies above offer opportunities to explore the complex relationship between communities and science.
Discussion Questions
- Review one or more of the case studies above and consider how these examples might reflect the ways science can negatively impact communities. Consider the case studies from multiple perspectives. Are there other explanations? Is there intentional harm evident? What are the root causes of the harm to communities?
- Beyond those listed above, do you know of (or can you find and share) an instance of environmental injustice or racism? How did it come to light, and if not ongoing, what was the outcome for the individuals or communities involved?
1.3.2. How We Practice Science Leads to Harm
As addressed in the previous section, many communities may be skeptical or uncertain about science and scientists because of historical experiences. This is not an anti-science position; communities of color and low-income communities, such as in Flint, Michigan, have valid historical reasons for this distrust [Ramirez-Andreotta, 2019], having often been ignored or silenced by the scientific community in dehumanizing ways. The Flint water crisis is a complex example of a community suffering long-term harm: The community’s concerns were ignored by governmental and public health officials, and science was needed to understand and prove the harm caused. It also demonstrates how science, or the lack of it, contributes to a cycle of harm, in which public officials were not held accountable for making decisions without scientific data and for ignoring public concerns, and the science the community needed to substantiate those concerns was not readily available to them. The crisis started in 2014 when the city changed its water source following cost-cutting measures ordered by a state-appointed emergency manager. The new supply was improperly treated, allowing lead from deteriorating pipes to leach into the water. Residents reported discolored water with an odor and a range of health ailments, and testing finally found lead at potentially harmful levels. For months, residents had raised alarms about the water, and for months, local and state officials dismissed them. Although progress has been made in valuing local community observations and expertise [e.g., Dosemagen and Parker, 2019], empirical research on science education illustrates how our understanding of how to practice science has been part of the problem and how it can be transformed into the solution through community-driven science.
Some ways the practice of science can harm communities can be categorized as follows:
- Creating direct and/or indirect (structural) harm to individuals
- Limiting access to resources
- Disrupting community norms and practices
- Limiting understanding of phenomena because of implicit bias, overreliance on objectivity, forced objectivity practices, or ignoring alternate perspectives
- Ignoring the negative impacts of a phenomenon or solution that might have a negative effect on a portion of a community
- Ignoring, minimizing, or deprioritizing concerns or priorities of certain communities
- Reinforcing, elevating, validating, or amplifying harmful ideas or stereotypes
The most obvious of these harms is direct harm to individuals. Although processes such as institutional review boards and informed consent have been put in place to address this type of harm, the impact of research, scientific solutions, or findings can still harm members of minority communities if those communities are not explicitly recognized and included in the research.
Science can cause harm, particularly to communities with less agency and recognized power and fewer resources because they are excluded from problem identification, exploration of impact, and other parts of the process where their perspective might provide a more holistic understanding of a phenomenon. Structural harm, resulting from that lack of inclusion, can become embedded in institutions like health care facilities and educational institutions, which then perpetuates inequality and social injustice.
The way we practice science often limits access to resources and limits agency for marginalized communities in the research process. These limitations arise either directly, from not making data or findings available or reusable, or indirectly, from limiting access to funding through grant and federal funding eligibility requirements. A further factor in this harm is that researchers themselves are constrained by funding as well as by institutional and publishing requirements and expectations. These types of restrictions leave marginalized communities out of the research practice; as a result, those communities are unable to advance the issues that matter to them, and findings are skewed for lack of representation, diverse data, and alternative explanations.
Science can disrupt the norms and practices of a community when it is done in a vacuum, without recognition of the impact on all communities or all members of a community. This happens in several ways. Disruption of traditional practices or norms can happen when a scientific solution is imposed with no observation or recognition of how traditional knowledge reflects an understanding of and connection to local phenomena. When scientific solutions are implemented without full inclusion of marginalized communities or marginalized members of a larger community, those solutions can be detrimentally disruptive of existing cultural practices. When we reject alternative ways of understanding phenomena, we not only lose out on innovative understanding, but more importantly, we hurt communities that we should be helping. To consider both types of harm, take the experience of Inuit youth in Inuit Nunangat in Canada. They were faced with an expectation that they implement science-focused climate solutions that invalidated their cultural understanding of how to live with their land. The externally imposed science solutions undermined their cultural identity, and the expectation that they were responsible for implementing those solutions created tension related to their identity and connections to community. They had to reconcile adapting to the climate changes affecting their native land with a non-Indigenous education that has erased Indigenous ways of living with that land and imposed expectations that they implement climate solutions that may not align with their cultural knowledge [Tagalik et al., 2023].
As another example, the construction of hydroelectric dams seems on the surface like a positive for communities because these dams provide renewable electricity in areas that may have limited options for power. However, the construction of those dams also changes food sources and living practices for communities near the dams. The ecological, cultural, social, and economic impacts of hydroelectric dams on local community norms and practices are only now being acknowledged through the work of social scientists [Fan et al., 2022]. Although some scientific studies, such as the one by Rocha Pompeu et al. [2022], look at the physical impact of hydroelectric dams on the environment, they do not cover the impact on communities and how the people in those communities live and thrive. This is an example of the way science can harm communities by ignoring the impact of scientific solutions on communities and the people in them.
The last two types of harm focus on how we understand the world through science. First, researchers control the research agenda, and when they ignore or deprioritize community concerns, they invalidate the real concerns of real people for whom science could help provide understanding and solutions. The earlier discussion of the Flint, Michigan, water crisis is an example of this type of harm. The deprioritization of community concerns perpetuates existing marginalization and systemic oppression of certain communities.
Further, science can also be used to reinforce existing harmful beliefs about marginalized communities. When science is conducted with existing beliefs taken as established truth, its findings will amplify harmful beliefs and stereotypes. A long-standing example is Samuel George Morton’s work in the 1800s, in which he measured the cranial capacity (the volume inside the skull) of human skulls he had collected, connected cranial capacity to brain size, and postulated that brain size equated to intelligence. He used the skulls as physical evidence supporting racial characteristics and hierarchies. Among other things, he concluded that Caucasians had the largest brain size and therefore were the most intelligent among the races. Even though there were contemporary critics of Morton’s findings, including Charles Darwin and Frederick Douglass, his scientific approach and findings associating cranial features with racial traits, and assigning intelligence as a racial trait, persisted through the 20th century. His findings have been used to justify and validate flawed beliefs about humans, including the eugenics movement and other flawed scientific arguments that some races are more intelligent and more socially and morally capable than others [Wade, 2021; Mitchell, 2018]. Notably, according to Mitchell [2018], Morton’s data and scientific method may have been valid and unbiased, but his conclusions and theories about race and racial science were not. Mitchell [2018] notes that a contemporary of Morton, Friedrich Tiedemann, produced similar data but reached entirely different conclusions about racial differences. Mitchell [2018, p. 1] says, “Tiedemann and Morton independently produced similar data about human brain size in different racial groups but analyzed and interpreted their nearly equivalent results in dramatically different ways: Tiedemann using them to argue for equality and the abolition of slavery, and Morton using them to entrench racial divisions and hierarchy.”
Addressing past wrongs and preventing future ones hinge on scientific endeavors upholding high ethical standards, transparency, and the inclusion of diverse voices. The scientific community needs to reckon with painfully learned lessons of the past but can learn to do better to achieve a more just and equitable future. Science has a long history of being used to justify racism and marginalize other systems of knowledge and action. This dominance has resulted in diverse voices and perspectives being left out of scientific decision-making efforts, as well as a lack of clarity surrounding how science is done. These factors contribute to the divide between the scientific community and local communities, highlighting the need for community-driven science [Liboiron, 2021].
Examples
Consider the following case study:
- Union of Concerned Scientists, Double Jeopardy in Houston: Acute and Chronic Chemical Exposures Pose Disproportionate Risks for Marginalized Communities
Discussion Questions
- What is your perspective on science and its impact on society?
- Ask a family member, community member, or friend about their views on science and its impact on society. Can they share any particular examples of why they might trust (or be skeptical about) scientific evidence on an issue?
1.4. What Is Justice-Centered Science?
Justice-centered science is an interdisciplinary field of work that prioritizes and weaves together social justice principles and scientific research and practice. This approach emphasizes the need to correct systemic inequities and to ensure that, far from serving as a means of control, scientific advancements are dedicated to a just future for everyone [Morales-Doyle, 2017]. Key questions that shape justice-centered science, according to Ambitious Science Teaching, include the following: How is science done? What counts as science? Who gets to do science? Further, justice-centered science critically asks not only who conducts research but also whose interests are being prioritized and who benefits from the resulting scientific knowledge and technological advancements. It critically considers how inequity, systemic oppression, and racism shape the practice, results, and impact of science and takes steps to address identified harms. By promoting diversity in research teams, incorporating moral and ethical decision-making, providing equitable access to scientific education and resources, and using science to combat social injustices [e.g., Schiebinger, 2008], justice-centered science fosters a more inclusive scientific future. Multiple lenses and perspectives can provide a more holistic understanding of natural phenomena. For example, in environmental science, chemical analysis is important, but social science insights into human behavior and policy impacts are arguably equally important.
Justice-centered science is important because it addresses past and current injustices in science and technology. For instance, marginalized groups have often been underrepresented in scientific research, both as subjects and as researchers, leading to gaps in knowledge and biased outcomes that perpetuate inequality [Ottinger, 2013]. Additionally, in the name of science and objectivity, these marginalized groups have been exploited or experimented on without their consent. In practice, justice-centered science means taking responsibility for ensuring that scientific work is designed and performed to incorporate a broad spectrum of ethical considerations, working toward reducing disparities and creating mutual well-being across all communities and societies. This strategy is consistent with larger social movements that call for more diversity and inclusivity across fields, underscoring the role of science as a driver of social progress and fairness [Morales-Doyle, 2017].
1.5. Practicing Justice-Centered Science
Justice-centered science aims not only to avoid the harmful consequences of science for marginalized communities but also to eliminate the inequitable distribution of scientific benefits and to build on the participation of communities impacted by the science. It prioritizes equity and inclusion in every stage of scientific work and recognizes the role power and systemic inequities play in the practice of science. One element of justice-centered science is the intentional involvement and input of vulnerable and marginalized groups in the design, implementation, and use of findings of the research endeavor. Methods of participatory research, in which community members are partners rather than subjects, help to ensure that research is designed to address the actual interests and concerns of communities [Israel et al., 2012].
Within a justice-centered framework, researchers have an ethical obligation to ensure their research is not complicit in the creation of harm or injustice. This obligation includes being accountable to the communities they study regarding their methodologies, their motivations, and the potential implications of their work [Resnik, 2015]. Environmental justice is a prime example of justice-centered science addressing the unequal distribution of environmental hazards in marginalized communities. For example, the Flint water crisis made it clear that environmental science needs a justice-centered turnaround: Researchers and activists were able to work together to address the health and rights of the large African American population whose water was contaminated by lead [Pulido, 2016].
1.6. Code of Research Ethics Incorporating Justice-Centered Science
As a result of obvious and identified examples of scientific harm like those discussed previously, research ethics standards have been established to safeguard human life and well-being in most scientific investigations, especially studies conducted on human subjects. However, with our awareness both of the ways science can harm communities and of the fact that science is not practiced in a vacuum, there is increasing acknowledgement that science is a social justice issue. Justice-centered science is focused on the interconnectedness of science and social justice. This approach recognizes that scientific work always occurs in a social context and should find ways to benefit all members of society rather than just focusing on privileged communities.
Research ethics codes are traditionally based on several principles, as outlined in the Belmont Report [National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, 1979]:
- Respect for persons: A principle focused on the autonomy and dignity of study participants. Informed consent is an essential practice that ensures participants understand what is being researched and the possible risks and benefits to them.
- Beneficence: A principle focused on ensuring that the benefits of research outweigh the risks and minimizing harm to participants.
- Justice: A principle focused on the fair distribution of benefits and burdens from the research. No group should bear all the burden of research in terms of risks or be systematically excluded from opportunities to benefit.
Justice-centered science incorporates principles of social justice and equity directly into the research process. Some of its key components include the following:
- Inclusivity and representation: Researchers must include diverse participants, particularly those from populations that have been historically excluded from and underrepresented in the scientific community.
- Community participation: Researchers must work in active partnership with community members, often interpreting results together. In this way, researchers can recognize the impact and potential power of their research on communities that might otherwise be overlooked.
- Equitable benefit distribution: Researchers must work to create fairness in the distribution of the benefits of research throughout all levels of society. This principle involves considering whether research findings can help redress social injustices and improve the lives of those who have traditionally been marginalized [e.g., Smith, 1999].
- Transparency and accountability: Researchers must be transparent and accountable to participants as well as to society. This principle can be accomplished through open and honest communication about research goals, methods, and the potential nonbeneficial or harmful consequences of the study [Resnik, 2018].
Discussion Questions
- Can you find a specific example of research that incorporates any or all of the four principles above?
- Can you think of a way these principles could be incorporated into a research project? How would the project change?
- Considering a research topic of interest to you, how could you incorporate any or all of the principles of justice-centered science into your research on that topic?
To advance justice-centered science, researchers should be trained in justice-based ethics, ensuring that all research is inclusive, engages communities, and distributes benefits fairly. Students and early career researchers should be encouraged to develop a personal research code of ethics based on the principles above as part of their education and training. Additionally, research funders and reviewers should require research plans that include explicit measures for applying justice-centered ethical principles. Incorporating these values into the culture of research is one way to instill them in practitioners [Shamoo and Resnik, 2015]. Institutions also need to develop and enforce policies that support justice-centered research practices. For example, an institution could require that ethical review boards assess elements of social justice when considering research proposals [Macfarlane, 2009]. Other supportive practices include encouraging and facilitating partnerships between researchers and community-based organizations, as well as collaborative approaches such as community-based participatory research, in which researchers partner with communities for knowledge generation [Minkler and Wallerstein, 2008]. Finally, it is vital to monitor and evaluate projects to ensure that justice-centered research principles are properly implemented.
Key features of a code-of-ethics approach include an emphasis on inclusivity, community engagement, equitable benefit distribution, and transparency, which together generate research that is not just scientifically robust but also socially just. Such a paradigm shift in research ethics is necessary to meet the challenges of our time and direct scientific progress toward a more just society.
An example research code of ethics is available from the TRUST Code.
1.7. Conclusions
Science, in the Western tradition, is a process of discovery aimed at understanding natural phenomena through critical thinking and skepticism, using the scientific method. However, this method, while valuable, can be rigid and may overlook social complexities. Historically, science has both advanced society and caused harm, particularly to marginalized communities, as seen in unethical experiments and environmental injustices. Justice-centered science is a response, integrating social justice into research by promoting inclusivity, community engagement, and equitable distribution of benefits. This approach emphasizes the need for ethical research practices that recognize and address past and present injustices, ensuring that scientific advancements benefit all communities equitably.
Key elements of practicing justice-centered science include applying an expansive code of ethics that recognizes and eliminates the harms that science can inflict; recognizing and acknowledging the ways science has caused harm; recognizing that science is influenced by individual, social, political, and cultural contexts; and applying participatory research practices that create more equitable science.
References
Dosemagen, S., and A. Parker (2019), Citizen science across a spectrum: Broadening the impact of citizen science and community science, Sci. Technol. Stud., 32(2), 24–33, https://doi.org/10.23987/sts.60419.
Fan, P., M. S. Cho, Z. Lin, and E. F. Moran (2022), Recently constructed hydropower dams were associated with reduced economic production, population, and greenness in nearby areas, Proc. Natl. Acad. Sci. U. S. A., 119(8), e2108038119, https://doi.org/10.1073/pnas.2108038119.
Guba, E. G., and Y. S. Lincoln (1994), Competing paradigms in qualitative research, in Handbook of Qualitative Research, edited by N. K. Denzin and Y. S. Lincoln, pp. 105–117, Sage, Thousand Oaks, Calif.
Israel, B. A., E. Eng, A. J. Schulz, and E. A. Parker (Eds.) (2012), Methods for Community-Based Participatory Research for Health, John Wiley, Hoboken, N.J.
Jones, J. H. (1993), Bad Blood: The Tuskegee Syphilis Experiment, Free Press, New York.
Kitcher, P. (2001), Science, Truth, and Democracy, Oxford Univ. Press, New York, https://doi.org/10.1093/0195145836.001.0001.
Liboiron, M. (2021), Pollution Is Colonialism, Duke Univ. Press, Durham, N.C., https://doi.org/10.1215/9781478021445.
Macfarlane, B. (2009), Researching with Integrity: The Ethics of Academic Enquiry, Routledge, London, https://doi.org/10.4324/9780203886960.
Mahmoudi, D., C. L. Hawn, E. H. Henry, D. J. Perkins, C. B. Cooper, and S. M. Wilson (2022), Mapping for whom? Communities of color and the citizen science gap, ACME Int. J. Crit. Geogr., 21(4), 372–388, https://doi.org/10.14288/acme.v21i4.2178.
Mazzocchi, F. (2006), Western science and traditional knowledge: Despite their variations, different forms of knowledge can learn from each other, EMBO Rep., 7(5), 463–466, https://doi.org/10.1038/sj.embor.7400693.
Minkler, M., and N. Wallerstein (Eds.) (2008), Community-Based Participatory Research for Health: From Process to Outcomes, 2nd ed., Jossey-Bass, San Francisco, Calif.
Mitchell, P. W. (2018), The fault in his seeds: Lost notes to the case of bias in Samuel George Morton’s cranial race science, PLoS Biol., 16(10), e2007008, https://doi.org/10.1371/journal.pbio.2007008.
Morales-Doyle, D. (2017), Justice-centered science pedagogy: A catalyst for academic achievement and social transformation, Sci. Educ., 101(6), 1034–1060, https://doi.org/10.1002/sce.21305.
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (1979), Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research, Dep. of Health, Educ., and Welfare, Washington, D.C., https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/read-the-belmont-report/index.html
Needleman, H. L. (1993), The current status of childhood low-level lead toxicity, Neurotoxicology, 14(2–3), 161–166.
Ottinger, G. (2013), Refining Expertise: How Responsible Engineers Subvert Environmental Justice Challenges, New York Univ. Press, New York.
Pulido, L. (2016), Flint, environmental racism, and racial capitalism, Capitalism Nat. Socialism, 27(3), 1–16, https://doi.org/10.1080/10455752.2016.1213013.
Ramirez-Andreotta, M. D. (2019), Environmental justice, in Environmental and Pollution Science, 3rd ed., pp. 573–583, Academic, London, https://doi.org/10.1016/B978-0-12-814719-1.00031-8.
Resnik, D. B. (2015), The Ethics of Science: An Introduction, Routledge, London.
Resnik, D. B. (2018), The Ethics of Research with Human Subjects: Protecting People, Advancing Science, Promoting Trust, Springer, Cham, https://doi.org/10.1007/978-3-319-68756-8.
Rocha Pompeu, C., F. J. Peñas, A. Goldenberg-Vilar, M. Alvarez-Cabria, and J. Barquin (2022), Assessing the effects of irrigation and hydropower dams on river communities using taxonomic and multiple trait-based approaches, Ecol. Indic., 145, 109662, https://doi.org/10.1016/j.ecolind.2022.109662.
Schiebinger, L. (2008), Gendered Innovations in Science and Engineering, Stanford Univ. Press, Stanford, Calif., https://doi.org/10.1515/9781503626997.
Shamoo, A. E., and D. B. Resnik (2015), Responsible Conduct of Research, 3rd ed., Oxford Univ. Press, New York.
Skloot, R. (2010), The Immortal Life of Henrietta Lacks, Crown Publishers, New York.
Smith, L. T. (1999), Decolonizing Methodologies: Research and Indigenous Peoples, Zed Books, London.
Tagalik, S., K. Baker, J. Karetak, and J. Rahm (2023), Rebuilding relations and countering erasure through community-driven and owned science: A key tool to Inuit self-determination and social transformations, J. Res. Sci. Teach., 60(8), 1697–1722, https://doi.org/10.1002/tea.21881.
Wade, L. (2022), The Ghosts in the Museum: Anthropologists are reckoning with collections of human remains – and the racism that built them, Science, https://doi.org/10.1126/science.abk3522.