Miller's research focuses on applying computational techniques to shed light on basic communicative and textual phenomena such as discourse and narrative. He has worked at the intersection of large-scale textual data, interactive textual and visual systems, the domain of human rights, and the possibilities of new modes of analysis and communication that depend on computation.
Describes (1) best practices for studying and moderating toxicity, (2) conceptual and legal frameworks for addressing hate speech and dangerous and toxic speech, (3) patterns of toxic language in online media, (4) next steps for building a reference corpus of toxicity types and a descriptive taxonomy, and (5) a humanistic perspective on the consequences of toxicity and its moderation procedures. Documents how practical anonymity in online communication has changed standards for interpersonal language, and identifies the most damaging of those changes.
Discusses how cross-document event-chain coreference in corpora of news articles would gain precision and generalizability from a method that consistently recognizes narrative, discursive, and phenomenological features such as tense, mood, tone, canonicity and breach, person, hermeneutic composability, speed, and time. Approaches this task using event segmentation, word embeddings, and variable-length pattern matching in a corpus of 2,000 articles describing environmental events.
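One embedding-based step in such a pipeline can be sketched as follows. This is a minimal illustration, not the project's actual method: the vectors and vocabulary below are hypothetical toy values, where a real system would use trained embeddings (e.g. word2vec or GloVe) over segmented event mentions.

```python
from math import sqrt

# Toy embedding table (hypothetical 3-d vectors for illustration only).
EMBEDDINGS = {
    "flood":    [0.9, 0.1, 0.0],
    "deluge":   [0.8, 0.2, 0.1],
    "election": [0.0, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def corefer(event_a, event_b, threshold=0.9):
    """Treat two event mentions from different documents as coreferent
    when their embedding similarity clears a threshold."""
    return cosine(EMBEDDINGS[event_a], EMBEDDINGS[event_b]) >= threshold
```

Under this toy table, "flood" and "deluge" link across documents as the same event type, while "flood" and "election" do not; a full system would combine such similarity scores with the narrative and discursive features listed above.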
Extracts information from 503 World Trade Center Task Force interviews, comprising 12,000 pages of testimony, and presents novel visualization techniques. Proposes a computational method for identifying narratives that emerge across the boundaries of individual interviews.
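One simple signal that a narrative thread crosses interviews is vocabulary shared across multiple transcripts. The sketch below is a hypothetical illustration of that idea with toy snippets, not the project's actual method, which operates on the full 12,000-page corpus.

```python
from collections import Counter

# Toy interview snippets (hypothetical text, for illustration only).
interviews = [
    "we evacuated down the north stairwell before the collapse",
    "the stairwell was full of smoke during the evacuation",
    "radio traffic was impossible to follow that morning",
]

def cross_interview_terms(docs, min_docs=2):
    """Return terms appearing in at least `min_docs` interviews,
    a crude marker of narratives shared across testimonies."""
    doc_freq = Counter()
    for doc in docs:
        doc_freq.update(set(doc.split()))  # count each term once per doc
    return {term for term, n in doc_freq.items() if n >= min_docs}
```

Here "stairwell" surfaces as a cross-interview term; a real pipeline would add stopword filtering, weighting (e.g. tf-idf), and alignment of the passages in which shared terms occur.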
Links scholarly and industry investigations into the application of Natural Language Processing (NLP) techniques and tools to advance current human rights violation research. The project began in 2012 and has expanded models for extracting information from witness statements and government reports, for visualizing that data to facilitate better event understanding, for understanding how researchers and investigators use computational methods in their human rights work, and for developing methods to model speakers' expressed certainty in their statements.
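Information extraction from a witness statement can be sketched at its simplest as pattern matching over the text. The statement, patterns, and field names below are hypothetical toy examples; a production system would use trained NLP models rather than regular expressions.

```python
import re

# A toy witness-statement fragment (hypothetical text, for illustration only).
statement = (
    "On 14 March 2004, soldiers entered the village of Santa Rosa "
    "and detained three residents."
)

# Minimal illustrative patterns for dates and place names.
DATE_RE = re.compile(r"\b\d{1,2} [A-Z][a-z]+ \d{4}\b")
PLACE_RE = re.compile(r"(?:village|town|city) of ([A-Z][a-z]+(?: [A-Z][a-z]+)*)")

def extract_events(text):
    """Pull candidate dates and places from a statement."""
    return {
        "dates": DATE_RE.findall(text),
        "places": PLACE_RE.findall(text),
    }
```

Structured fields like these are what downstream visualization and event-understanding tools consume; modeling a speaker's expressed certainty would require additional classification over the surrounding clauses.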