Hi, I'm Vivek! (he/him)
that's [ viv-ake ]
I design data visualizations, dashboards, websites, and digital tools that communicate complex ideas in simple ways.
I believe in taking a user-centered approach when visualizing data. This involves lots of research, testing, and rapid prototyping. I also think it's critical to collaborate with developers while designing. I'm comfortable using Figma, d3.js, React, Tableau, PowerBI, Python, SQL, the Adobe Creative Suite, and more.
Outside of work, I'm passionate about biking, painting, journalism, and ethical AI. Reach out to me if you'd like to chat about those things or about any of my projects below.
Oh, and one more thing. I love puzzles and do the New York Times' sudoku (almost) every day. I last did it on 10/31/2024 and finished it in 4:20. If you'd like to learn more about my sudoku times, I wrote this essay!
First place, data visualization: Society of Professional Journalists
First/third place, interactive graphic: Columbia Scholastic Press Association
First place, best journalism website: Indiana Professional Chapter of the Society of Professional Journalists
Second place, informational graphic: Indiana Collegiate Press Association
Honorable mention, interactive graphic: Associated Collegiate Press
Figma Python React d3.js
This is a data-driven essay I wrote to analyze how I fill out the New York Times sudoku. For this piece, I created a browser extension to track how I fill the Times grid, built out a server to collect data, wrote Python notebooks for analysis, and used d3.js + React for data viz.
I also open-sourced my data and code. Take a look at the entire repo here.
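To give a flavor of the analysis, here's a minimal sketch of how timing stats could be computed from the extension's logs. The event format and values below are illustrative, not the actual schema from my repo.

```python
from datetime import datetime

# Hypothetical sample of fill events captured by the browser extension:
# each entry is (ISO timestamp, index of the cell that was filled).
events = [
    ("2024-10-31T09:00:00", 12),
    ("2024-10-31T09:01:30", 40),
    ("2024-10-31T09:04:20", 77),
]

def solve_time_seconds(events):
    """Elapsed time between the first and last cell fill."""
    times = [datetime.fromisoformat(ts) for ts, _ in events]
    return (max(times) - min(times)).total_seconds()

print(solve_time_seconds(events))  # 260.0, i.e. a 4:20 solve
```

The real notebooks go further (per-cell timing, fill order, difficulty by day of week), but every metric starts from a reduction like this one.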
Figma user testing Tableau AWS Python
I worked with a designer and a Tableau developer to help create two dashboards for a health agency. I designed and conducted user research for an internal dashboard that the agency uses to identify nursing homes that need intervention.
I also used AWS and Python to create a data pipeline that populated the dashboards. Read more about my design and development process here.
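As a rough illustration of one transform step in such a pipeline, the sketch below normalizes raw facility records into dashboard-ready rows. The field names, threshold, and sample data are all made up for this example, not the agency's actual schema.

```python
# Illustrative transform: keep facilities with a reported score
# and flag the ones that may need intervention.
def to_dashboard_rows(raw_records):
    rows = []
    for rec in raw_records:
        score = rec.get("inspection_score")
        if score is None:
            continue  # skip facilities with no data yet
        rows.append({
            "facility": rec["name"],
            "score": score,
            "needs_intervention": score < 60,  # hypothetical cutoff
        })
    return rows

sample = [
    {"name": "Oak Manor", "inspection_score": 45},
    {"name": "Pine View", "inspection_score": None},
    {"name": "Elm Court", "inspection_score": 88},
]
print(to_dashboard_rows(sample))
```

In the real pipeline, steps like this ran on AWS and wrote to the data source backing the Tableau dashboards.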
data wrangling Figma bivariate choropleths scrollytelling
For this piece, I worked with data scientists to identify counties in the U.S. that received less philanthropic aid relative to need. We used bivariate choropleths and scrollytelling to explain the project's methodology.
This piece was special to me because I was on the founding team for an affinity group in the McKinsey Boston office to promote social mobility.
data analysis dashboarding Plot.ly geocoding Mapbox
Analyzed utility usage for more than 100 structures on IU's campus. Created a JavaScript visualization that uncovered a fountain wasting 10 million gallons of water annually.
For this story, I learned how to ask hard questions: I was at the university's physical plant at least twice a week, pressing the assistant director on why the leak had gone undetected.
data analysis cron jobs Plot.ly Python web scraping
Built a web scraper to collect laundry use data across residence halls on campus to determine the optimal time to do laundry. Created a JavaScript heatmap to visualize this data.
Set up my first data pipeline and created a slackbot to alert the team if there was an issue while scraping.
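The core of the heatmap is a simple aggregation: average machine usage per (weekday, hour) cell. Here's a small sketch of that step with made-up readings; the real scraper collected these on a cron schedule.

```python
from collections import defaultdict

# Illustrative readings: (weekday 0-6, hour 0-23, machines in use).
readings = [
    (0, 18, 9), (0, 18, 11), (0, 7, 2),
    (5, 14, 12), (5, 14, 10), (2, 7, 1),
]

def usage_grid(readings):
    """Average machine usage per (weekday, hour) heatmap cell."""
    sums, counts = defaultdict(float), defaultdict(int)
    for day, hour, in_use in readings:
        sums[(day, hour)] += in_use
        counts[(day, hour)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

grid = usage_grid(readings)
quietest = min(grid, key=grid.get)  # best (weekday, hour) to do laundry
print(quietest, grid[quietest])
```

The JavaScript heatmap rendered exactly this kind of grid, and the slackbot fired whenever a scrape returned no readings.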
creative direction deadlines responsive web layouts
Designed and developed a longform piece that uncovered how an IU professor was allowed to teach despite exhibiting “a concerning pattern of singling out certain students” and sexually harassing them.
This piece taught me the importance of coordination as I worked on it from a different country and time zone. Our team prioritized timeliness, communication and frequent check-ins. The professor resigned at the end of that week. Check out all of our stories here.
project management Adobe Creative Suite responsive web layouts
On this project, my team and I asked questions ranging from “How big should this font be?” to “How do we give underrepresented voices more space on the homepage?”
Our focus was on making branding more consistent, optimizing ad revenue, updating fonts, and giving web layouts a facelift. In its first year, the redesigned website brought in $15,000 more in annual advertising revenue.
dimension reduction (UMAP) documentation Asana
For my master's practicum project, my team and I worked with a local school district to measure the effect of various tutoring interventions on student academic performance. We used difference-in-differences and 2SLS models.
My role on the team was to use dimension reduction to identify student groups based on learning and demographic features. Our paper and poster were featured at the Midwest Decision Sciences Institute (MWDSI) and 2022 INFORMS Business Analytics conferences.
fine-tuning LLMs scikit-learn nltk web scraping
Fine-tuned OpenAI's GPT-2 to simulate speeches and visualized the insights through responsive graphics.
This was my first foray into natural language processing, and I learned a lot about machine learning and best practices for writing code. I got to work with dimensionality reduction, string analysis, data cleaning, and more.
Python cartography data viz
Created and managed Indiana's first coronavirus tracker, launching it before even the state health department had one.
I later used Python to automate the map by setting up a data pipeline. I also posted daily updates to the newsroom's Twitter and Facebook.
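The daily update boiled down to a diff between yesterday's and today's county counts, which fed both the map and the social posts. Here's a minimal sketch with made-up numbers; county names and counts are illustrative only.

```python
# Hypothetical snapshots from two consecutive pipeline runs.
yesterday = {"Marion": 120, "Monroe": 45}
today = {"Marion": 134, "Monroe": 45, "Tippecanoe": 3}

def daily_changes(prev, curr):
    """New cases per county since the last run of the pipeline."""
    return {county: count - prev.get(county, 0)
            for county, count in curr.items()}

print(daily_changes(yesterday, today))
# {'Marion': 14, 'Monroe': 0, 'Tippecanoe': 3}
```

Automating this step meant the map and the daily Twitter/Facebook updates always worked from the same numbers.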