If there’s one lesson we can take away from technology and recent news cycles, it’s that the internet is severely broken.
Defending and repairing its integrity — particularly by limiting the spread of disinformation — is a monumental undertaking, but New Knowledge is up to the challenge. Since 2015, the company has dedicated its work to detecting and disrupting social media campaigns that have dire consequences for people, brands and organizations around the globe.
We spoke with three members of the New Knowledge team who are scaling efforts to find disinformation on the web — and eliminate it for good.
EMPLOYEES: Around 50, 45 of whom are based in Austin
WHAT THEY DO: Simply put, New Knowledge defends information integrity. Serving high-profile brands and organizations, they detect and disrupt disinformation campaigns to prevent them from manipulating the public.
WHERE THEY DO IT: Austin
“DEFEND THE INTERNET”: It’s the company’s motto — the team is on the front lines of the internet every day working to stop the spread of disinformation.
IDEAL CANDIDATE: “Someone who can experiment and share what’s worked, or what didn’t work, with passion, excitement and humility,” says New Knowledge’s Emily Brodman.
Chip Young, Data Architect
Chip plays a key part in building world-class data infrastructure at New Knowledge and is also responsible for setting the company’s technical vision and roadmap.
BEYOND WORK: Austin is a great city, but Chip loves to visit new and exciting places. Just last year, he traveled to Cozumel, Spain, Portugal, Italy, and Costa Rica.
New Knowledge prides itself on being a mission-driven company. Where does your team fit into the mission of defending narrative authenticity and information integrity?
We provide the data that makes it possible for our data scientists, computational disinformation analysts and engineers to detect manipulative narratives and monitor factions of hyperactive accounts, enabling our customers to make strategic decisions. We're anthropologists of the internet, deciphering the vagaries and vulnerabilities of our information ecosystem.
What’s the largest looming challenge in the disinformation arena?
Defending and preserving brand integrity is now a major concern for many corporations, especially ones that take stands on cultural issues. We work to actively identify influence campaigns early, track them, and work with our customers to diminish or disrupt the effects of these campaigns. The future event horizon of the merging of increasingly sophisticated micro-targeting with AI content generation — where an algorithm can create content precisely tailored to manipulate a specific individual — is a dystopian singularity we must avoid. Solving the immediate challenges of disinformation is a prerequisite to preventing that future state.
We’re anthropologists of the internet deciphering the vagaries of our information ecosystem.”
Do you think that the work you do is changing the way we interact with and digest information online? How?
I do. We’ve significantly raised awareness of disinformation campaigns and false narratives through efforts such as the report we did for the Senate Select Committee on Intelligence on Russian attempts to influence the 2016 election. We’ve led the way in how to identify, analyze and respond to influence campaigns. The work that we, as well as other organizations and individuals in this space, have done has led to a growing public awareness of these problems and the threat they pose. One consequence of this raised awareness has been an effort by social media companies to deal with these issues and take action instead of looking the other way.
Emily Brodman, Director of Product
Emily leads the product team. She identifies and defines the problems the product team needs to solve to best serve New Knowledge’s clients, and enables her team to be successful.
BEYOND WORK: Cooking and baking aren’t far removed from product management: Emily enjoys experimenting in the kitchen, building something out of nothing and sharing her creations with others.
With previous roles at Google and Cratejoy, what attracted you to New Knowledge?
Before New Knowledge, I built and supported ad tech products at Google and small business software at Cratejoy. I’ve spent most of my career helping businesses understand and make sense of data, and it’s no different at New Knowledge. We’re educating our users about a new way to think about their brand, about social and online media, and about the internet.
Since joining the team eight months ago, what has impressed you or surprised you most about New Knowledge?
I’ve been so impressed by how creatively our product and engineering teams tackle analyzing the way in which narratives spread and evolve online. We come from so many different backgrounds — the intel community, academia, big data and startups — and it means everyone is constantly pushing each other to come up with a better, more interesting solution. Recently, a few of us across data science, product and engineering sat down to talk through a challenging problem one of our users had and each of us approached it from our own angle.
We had data scientists and researchers who related it to counterterrorism problems they’d seen before. Engineers and analysts compared it to academic research they’d read or participated in, and I saw it through my background in ad tech and data-driven marketing. We found a lot of commonalities and unique approaches, and ended up with a solution that none of us would have developed on our own.
We’re educating our users about a new way to think about their brand, about social and online media, and about the internet.”
What are you most excited about for the future of New Knowledge?
I’m excited to build the language, the tools, and the methods for explaining how narratives shift and how language and distribution can be manipulated, but building it isn’t enough. It’s exciting that New Knowledge is looking at it from all sides: partnering with election integrity organizations and researchers, helping brands and organizations understand this problem, and partnering with press and platforms so everyday people can see the internet the way we do.
Numa Dhamani, Machine Learning Engineer
As a member of the NK Labs Research Team, Numa drives emerging machine learning and AI research, which informs the tools New Knowledge needs to help companies and organizations fight disinformation online.
BEYOND WORK: A proud Etsy store owner, Numa creates custom, hand-painted shoes and other knick-knacks. Running a small business has made her a more thoughtful leader and communicator.
Tell us about the NK Labs team. How is your team working to combat disinformation?
New Knowledge’s research division, NK Labs, is responsible for driving discoveries in emerging artificial intelligence and machine learning research through open-source DARPA programs — Data-Driven Discovery of Models (D3M) and Active Social Engineering Defense (ASED) — which we use internally to build tools to fight disinformation online, in collaboration with the product data science team.
My team has been a key participant in the two DARPA programs, contributing research — network analysis, time series, anomaly detection, natural language processing and computer vision — toward building New Knowledge’s product. We have also engaged industry leaders in numerous conversations about an ethical framework for AI through Data for Democracy, and have presented our research at conferences and local community events, as well as in academic papers and blog posts.
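To give a flavor of the anomaly-detection research area mentioned above (this is a generic illustrative sketch, not New Knowledge's actual method, and the function name and data are hypothetical), a rolling z-score over a posting-volume time series can flag sudden bursts of activity that merit a closer look:

```python
from statistics import mean, stdev

def flag_bursts(counts, window=7, threshold=3.0):
    """Flag indices where volume spikes well above the recent baseline.

    Uses a simple rolling z-score: compare each point to the mean and
    standard deviation of the preceding `window` points. Illustrative only.
    """
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # flat baseline; z-score undefined
        if (counts[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Hypothetical hourly post counts for a hashtag; the spike at the end
# is the kind of signal a real pipeline would surface for analysts.
volumes = [12, 15, 11, 14, 13, 12, 16, 14, 13, 240]
print(flag_bursts(volumes))  # -> [9]
```

A production system would of course work with richer signals (account age, posting cadence, content similarity), but the core idea of comparing activity to a recent baseline carries over.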
Malicious actors have the ability to construct and disseminate false narratives [...] Social media is an amplifier, but we’re fighting back.”
What tools does your team currently use? How do these tools enable your team to work efficiently and successfully?
My favorite tools that we use on a daily basis are Python, Docker, and GitHub. Python has an extensive selection of libraries and frameworks for machine learning and deep learning, and in many cases, these packages are written specifically for Python.
Docker solves many of the issues that get in the way of people working together by isolating applications into containers with specific instructions, so they can be ported from machine to machine — it’s a game changer for working efficiently and successfully. GitHub is a hosting service for Git, a version control system, which helps a group of developers collaborate on projects by tracking changes and making it easy to merge code.
What’s a major challenge in combating the dissemination of disinformation? What are you doing to help your team overcome it?
Social media platforms that we all know and use — Facebook, Instagram, Twitter, and more — are designed for virality, not civil discourse. Malicious actors have the ability to construct and disseminate false narratives, which can be liked and retweeted or shared by thousands of other users. Social media is an amplifier, but we’re fighting back. We’re working on fundamental machine learning and artificial intelligence research in this space to build state-of-the-art technologies to proactively monitor and detect coordinated disinformation campaigns on social media platforms.
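One simple heuristic in the spirit of detecting coordinated campaigns (a hedged sketch under stated assumptions, not New Knowledge's production approach; the function, account names and posts are all hypothetical) is to flag groups of distinct accounts that push identical text within a narrow time window, since organic sharing is rarely that synchronized:

```python
from collections import defaultdict

def find_coordinated_groups(posts, window_secs=60, min_accounts=3):
    """Flag any text posted by many distinct accounts within a short span.

    `posts` is a list of (account, text, unix_timestamp) tuples.
    Illustrative heuristic only; real systems use far richer signals.
    """
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((ts, account))

    suspicious = {}
    for text, entries in by_text.items():
        entries.sort()  # order by timestamp
        accounts = {account for _, account in entries}
        span = entries[-1][0] - entries[0][0]
        if len(accounts) >= min_accounts and span <= window_secs:
            suspicious[text] = sorted(accounts)
    return suspicious

# Hypothetical posts: three accounts push the same message in 25 seconds,
# while an unrelated post from a single user is left alone.
posts = [
    ("bot_a", "Candidate X hates puppies!", 1000),
    ("bot_b", "Candidate X hates puppies!", 1010),
    ("bot_c", "Candidate X hates puppies!", 1025),
    ("user_1", "Nice weather in Austin today", 1005),
]
print(find_coordinated_groups(posts))
```

Exact-match text and a fixed window are deliberate oversimplifications; the machine learning research described in the interview is what lets real detection tolerate paraphrased content and staggered timing.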