The Shift in Perspective
Usually, my research process starts with brainstorming a few ideas and using generative AI to expand them into various frameworks; further dialogue with the tool then lets me test their feasibility. Once I have a workable plan, I look for academic support through Google Scholar, CNKI, and the Leeds Library.
However, reflecting on the Week 15 readings, I’ve realized that despite the efficiency of this method, it has significant downsides that warrant a critical re-evaluation of my workflow.
Identifying the Risks
Upon review, I have identified four major methodological issues related to over-reliance on AI tools:
- Echo Chambers: AI sycophancy can stifle critical thinking, because the model often echoes or "flatters" the user’s perspective rather than challenging it with necessary friction.
- Systemic Bias: The Western-centric nature of AI training data often results in biased outputs that lack diverse, non-hegemonic, or innovative viewpoints.
- Linguistic Erosion: For a student and English learner, over-reliance on AI can subtly reshape the way I organize my language and hinder the authentic development of my personal style.
- Stifled Innovation: AI interactivity is grounded in LLM mechanisms. Rather than fostering original thinking, it risks constraining my ideas to patterns drawn from existing data, which may itself encode substantial bias and discrimination.
Strategic Pivot: Qualitative Focus
The reading on digital methods was particularly enlightening. It provided clear examples of how to conduct research digitally without coding knowledge and highlighted why these methods are so distinctive in the digital era.
Consequently, considering the constraints of time and energy, I have decided to prioritize qualitative methods over quantitative ones, as they align better with my current skills and project goals.
FIELD NOTE: Algorithms & Prejudice
Inspired by the method of “googling black girls”, I searched for "Asian" and "Asian people" on Yahoo and Google.
The results were striking. I found advertisements for dating apps targeted at middle-aged males with titles referencing "Asian girls," alongside a disproportionate representation of Asian women in the general category of "Asian people."
This phenomenon piqued my interest: because a search engine is, fundamentally, an advertising system, it commonly optimizes such results for profit. These results may therefore reveal a systemic form of racism or sexism embedded in algorithmic structures, one that warrants further research.