Bridging Economics and Technology: My Internship Experience
- Diya Gumaste

- Sep 3
- 5 min read

When I first heard that majoring in economics was really about learning how to think, I wasn’t sure what to make of it. Was I really paying to “learn how to think”? I loved economics, and I appreciated the concepts, theories, and their real-world applications, but I couldn’t fully connect that advice with where I saw my academic path leading. This summer, though, during my internship in the Technology and Innovation (T&I) office at Red Cedar, that message finally clicked. I discovered that economics doesn’t just teach you about supply and demand or exchange rates—it equips you with a mindset, a structured way of thinking, and a problem-solving framework that can be applied far beyond the classroom.
From Economics to Innovation
Economics has applications everywhere. From the smallest choices, such as buying an apple instead of a pear, to massive global interactions such as currency fluctuations, economics provides a lens for understanding decision-making and trade-offs. One place where I was surprised to find this mindset in high demand is technology. Walking into Red Cedar’s T&I office, I wasn’t sure how my economics background would fit. But I quickly realized that the analytical reasoning, patience with data, and comfort with ambiguity that economics fosters are exactly the traits needed to succeed in innovation-driven environments.
Hands-On Learning in Excel and Python
As a student, it is often difficult to understand how the things we learn in the classroom are used in day-to-day jobs. As an intern in the T&I office, I was given the opportunity to use some of those tools myself. I gained experience in Excel and Python, enjoyed working with both, and found it informative to see how those skills translate to data development and the broader data science lifecycle. The plan was to finish the lifecycle and visualize the data in Tableau, but we ran out of time.
The most daunting task I was given this summer was replicating a mean-win-rate formula from a website. The purpose of the formula was to determine which AI model performed best, given [Accuracy] and [Efficiency] scores across the scored subjects. I had no idea how to approach the task, but with encouragement to use AI as a tool (ChatGPT), I was able to build an accurate formula in Excel. I started in Excel because I could visualize the calculation process clearly there; Python, while powerful, was still relatively new to me. Excel let me test the formula, confirm it produced the correct outputs, and understand each part of the logic before translating it into code. The website provided the mean win rate for [Efficiency], so I could confirm that my Excel formula worked; I then altered it to work for [Accuracy].
Here’s the Excel formula I used:
=LET(
rng, $C$2:$BH$80,
r, ROW()-ROW($C$2)+1,
wins, SUM(BYCOL(rng, LAMBDA(c, COUNTIF(c, "<"&INDEX(c, r))))),
ties, SUM(BYCOL(rng, LAMBDA(c, COUNTIF(c, INDEX(c, r)) - 1))),
comps, SUM(BYCOL(rng, LAMBDA(c, COUNTA(c) - 1))),
IF(comps=0, "", (wins + 0.5 * ties) / comps)
)
Breaking It Down
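In plain terms, for each metric column the formula counts how many other models scored lower than the current model (wins), how many scored exactly the same, not counting the model itself (ties), and how many other models had a score at all (comparisons). It then pools those counts across every column and computes (wins + 0.5 * ties) / comparisons. Here is a tiny Python sketch with made-up numbers, separate from the real project, just to show the arithmetic for a single metric:
scores = [10, 20, 20, 30]  # toy scores for four models on one metric
for i, s in enumerate(scores):
    wins = sum(1 for other in scores if other < s)        # models this one beats outright
    ties = sum(1 for other in scores if other == s) - 1   # same score, excluding itself
    comps = len(scores) - 1                                # comparisons against every other model
    print(f"Model {i}: win rate = {(wins + 0.5 * ties) / comps:.2f}")
# Prints 0.00, 0.50, 0.50, 1.00; pooling wins, ties, and comparisons across all
# metric columns in the same way gives the mean win rate.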
Now that I had the right numbers, I had to write a Python script that produced the same results so it could be used in a larger project. This was a major step outside my comfort zone, but also one of the most rewarding parts of the summer. With guidance from ChatGPT and some trial and error, I built the following code:
import pandas as pd
import numpy as np
from pathlib import Path
# File paths
FILE_IN = Path("Accuracy III.xlsx")
FILE_OUT = FILE_IN # Overwrite same file
# Load the data
df = pd.read_excel(FILE_IN, sheet_name=0) # or use sheet_name="Sheet1"
# Identify metric columns (numeric and not 'Model')
metric_cols = [col for col in df.columns
               if col != "Model" and pd.api.types.is_numeric_dtype(df[col])]
# Convert relevant part to NumPy for speed
X = df[metric_cols].to_numpy(dtype=float)
n_models, n_metrics = X.shape
# Initialize
wins = np.zeros(n_models, dtype=float)
ties = np.zeros(n_models, dtype=float)
comps = np.zeros(n_models, dtype=float)
# Calculate win, tie, comp per model
for j in range(n_metrics):
    col = X[:, j]
    valid = ~np.isnan(col)
    idx = np.where(valid)[0]
    if len(idx) <= 1:
        continue
    col_v = col[valid]
    diff = col_v[:, None] - col_v[None, :]
    win_count = (diff > 0).sum(axis=1)  # a win = the other model's score is lower, mirroring the Excel COUNTIF("<") logic
    tie_count = (np.abs(diff) < 1e-8).sum(axis=1) - 1  # exclude self
    # Update aggregates
    wins[idx] += win_count
    ties[idx] += tie_count
    comps[idx] += len(idx) - 1
# Final win rate (avoid divide by zero)
win_rate = np.divide(
    wins + 0.5 * ties,
    comps,
    out=np.full(n_models, np.nan),
    where=comps > 0
)
# Attach back to DataFrame
df["Mean win rate"] = win_rate
# Save to same file (replaces the sheet!)
with pd.ExcelWriter(FILE_OUT, engine="openpyxl", mode="a", if_sheet_exists="replace") as writer:
    df.to_excel(writer, index=False, sheet_name="Sheet1")
print(f"✅ Done. Wrote mean win rates to {FILE_OUT.resolve()}")
Working through this was intimidating but also empowering. Seeing my Python code run correctly for the first time and produce the exact numbers I had tested in Excel was a breakthrough moment. It showed me that even with limited experience, I could tackle complex technical problems by approaching them step by step.
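For anyone attempting something similar, that agreement check can itself be scripted. Here is a rough sketch; the "Excel win rate" column name is hypothetical and just stands in for wherever the spreadsheet formula’s results end up:
import numpy as np
import pandas as pd
df = pd.read_excel("Accuracy III.xlsx", sheet_name="Sheet1")
excel_col = "Excel win rate"  # hypothetical column holding the Excel formula's output
if excel_col in df.columns:
    # True only if every Python value matches the spreadsheet value (NaNs treated as equal)
    print("Python matches Excel:", np.allclose(df[excel_col], df["Mean win rate"], equal_nan=True))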
Observing Tableau in Action
After building the data foundation, the next step was visualization in Tableau. Although I didn’t have time to directly work in Tableau myself, I observed how the data I had helped prepare was transformed into visual models. This gave me a window into how technical work becomes accessible, readable, and useful to decision-makers. Watching this process sparked my curiosity about visualization tools, and I look forward to learning more in the future.
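I didn’t get to recreate any of that myself, but as a rough sketch of the idea, even the new "Mean win rate" column can be turned into a quick chart straight from pandas. This is my own illustration, it assumes matplotlib is installed, and it is nothing like the dashboards the team actually built in Tableau:
import pandas as pd
df = pd.read_excel("Accuracy III.xlsx", sheet_name="Sheet1")
ax = (df.sort_values("Mean win rate", ascending=True)
        .plot.barh(x="Model", y="Mean win rate", legend=False, title="Mean win rate by model"))
ax.figure.savefig("mean_win_rate.png", bbox_inches="tight")  # highest win rate plots as the top bar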
Lessons Learned
Looking back, my internship was more than just an introduction to new tools. It was a lesson in how to approach the unknown. Economics had already taught me to analyze problems, weigh trade-offs, and think logically. But in the T&I office, I learned to apply that mindset to coding, data analysis, and problem-solving in unfamiliar areas. Using AI tools like ChatGPT wasn’t about taking shortcuts; it was about accelerating learning and building confidence.
Most importantly, I learned that the real value of my economics background isn’t just the knowledge; it’s the way of thinking. That way of thinking gave me the courage to work through Excel formulas, write my first real Python code, and begin to understand the role of data visualization. And it reminded me that adaptability, curiosity, and problem-solving are as important as technical skills.
Conclusion
This summer gave me a head start on the data science journey I plan to continue in the classroom over the next few years. It also reinforced my passion for economics, showing me how it connects to fields I had never considered. Economics isn’t just about markets or statistics; it is a gateway to diverse career opportunities. By combining the structured way of thinking economics provides with new technical skills, I feel better prepared to contribute meaningfully in a world where technology and innovation are reshaping how we work and think. In the end, what I once doubted, that economics is about learning how to think, has proven to be one of my greatest assets.



