Excel Interview Questions

Dan Lee, Data & AI Lead
Last updated: March 13, 2026

Excel interviews at top-tier firms like Goldman Sachs, JP Morgan, and McKinsey aren't about basic SUM formulas. Investment banks test your ability to build risk models under pressure, consulting firms want to see complex data manipulation for client deliverables, and tech companies expect you to handle million-row datasets without breaking a sweat. If you're targeting senior data analyst roles, expect live screen shares where you solve messy real-world problems in 15-20 minutes.

What makes Excel interviews brutal is the gap between knowing functions and solving business problems fast. You might know VLOOKUP, but can you debug why 200 customer IDs aren't matching when everything looks identical? You understand pivot tables, but can you explain to a VP why customer counts doubled when you switched from Count to Distinct Count? The best candidates don't just write formulas, they diagnose data quality issues, optimize performance, and build models that won't break when assumptions change.

Here are the top 31 Excel interview questions organized by the skills that separate senior candidates from beginners.



Core Formulas and References

Core formula questions reveal whether you can build production-ready models or just classroom exercises. Most candidates write formulas that work once, then break when copied down 1,000 rows or when columns get inserted. Interviewers watch for absolute vs relative references, proper use of mixed references, and formulas that handle edge cases without manual intervention.

The killer mistake is forgetting that real models get updated monthly with new data and shared across teams. Your VLOOKUP might work perfectly in the interview, but if it references a hardcoded range instead of a dynamic table, it's not enterprise-ready. Master absolute references like $F$1 for constants, mixed references like A$1 or $A1 for formulas you fill in two directions, and approximate-match lookups that don't break when your lookup table changes.


Start here: you are tested on whether you can translate a business question into correct formulas using relative and absolute references. Candidates often struggle when copying formulas across ranges, handling blanks, and avoiding subtle logic errors under time pressure.

You are building a monthly revenue model. Column B has Unit Price, column C has Units, and cell F1 has the FX rate to convert to USD. Write a formula in D2 for USD Revenue that you can copy down 5,000 rows without breaking when you fill down.

Goldman Sachs · Easy

Sample Answer

Most candidates default to a fully relative reference like =B2*C2*F1, but that fails here because F1 will shift to F2, F3, and so on when you copy down. You want price and units to move by row, but the FX rate to stay fixed. Use =B2*C2*$F$1 so only the row-based inputs change. If you might copy across columns too, you still want the fully absolute $F$1 anchor.

Practice more Core Formulas and References questions

Lookup and Join Logic

Lookup questions test your ability to join messy real-world datasets where perfect matches are rare. In consulting and banking, you're constantly enriching transaction data with customer info, mapping product codes across systems, or pulling the latest ratings from time-series tables. The challenge isn't writing a VLOOKUP, it's handling data quality issues that cause joins to fail silently.

Smart candidates always validate their joins by checking row counts before and after, flagging unmatched records, and quantifying the business impact of missing data. A common trap is assuming clean data when CustomerID '00123' (text) won't match CustomerID 123 (number), even though they look identical on screen.


In interviews, you will be asked to match records across tables, resolve duplicates, and handle missing keys using XLOOKUP, INDEX MATCH, or related patterns. You can lose time if you do not know how to control match behavior, return multi-column results, or validate join quality.

You have an Orders table with CustomerID and an old Customer List where CustomerID sometimes has leading zeros as text. You need to pull Segment into Orders and avoid mismatches like 00123 vs 123. How do you write the lookup, and what match settings do you use?

JP Morgan · Medium

Sample Answer

Normalize the key on both sides and then use an exact-match XLOOKUP. Convert both IDs to a consistent representation, for example `=XLOOKUP(TEXT([@CustomerID],"00000"),TEXT(Customers[CustomerID],"00000"),Customers[Segment],"Missing",0)` so 123 and 00123 align. The `0` match_mode forces an exact match so you do not silently pick a near match. Use "Missing" as the if_not_found argument so you can filter for and audit join failures.
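The same normalize-then-join idea is easy to sanity-check outside Excel. Here is a minimal Python sketch of the zero-pad logic, with made-up customer data; `norm_id` and the five-character width are illustrative choices, not part of the original answer:

```python
# Sketch of key normalization before a join (hypothetical data):
# zero-pad every CustomerID to 5 characters so 123 and "00123" align.

def norm_id(value, width=5):
    """Normalize a customer ID (int or string) to a fixed-width text key."""
    return str(value).strip().zfill(width)

customers = {"00123": "Enterprise", "00456": "SMB"}        # key -> Segment
segments = {norm_id(k): v for k, v in customers.items()}

orders = [123, "00456", "999"]                             # mixed-type IDs
joined = [segments.get(norm_id(o), "Missing") for o in orders]
print(joined)  # ['Enterprise', 'SMB', 'Missing']
```

The "Missing" default plays the same role as XLOOKUP's if_not_found value: unmatched rows surface explicitly instead of silently matching or erroring.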

Practice more Lookup and Join Logic questions

Data Cleaning and Text Parsing

Data cleaning separates analysts who can work with real data from those who only handle textbook examples. Every CSV export has mixed date formats, hidden characters, and inconsistent text fields that need standardization before analysis. Investment banks care deeply about data quality because wrong dates or duplicate IDs can trigger compliance issues.

The most dangerous cleaning mistakes happen silently. Excel might auto-convert '03/07/2025' to March 7th when you meant July 3rd, or your text functions might miss non-breaking spaces that look like regular spaces but break joins. Always validate your cleaning logic with edge cases and spot-check the results manually.


Expect messy exports: you need to standardize dates, split and combine fields, remove hidden characters, and dedupe reliably. Many candidates get tripped up by inconsistent formatting, locale-specific date parsing, and cleaning steps that break downstream analysis.

You receive a CSV export where the Order Date column is mixed: some rows look like "03/07/2025", others like "7-Mar-2025", and some are true Excel dates stored as numbers. How do you standardize everything into a real Excel date without silently swapping day and month for EU-formatted rows?

Amazon · Hard

Sample Answer

You could take a formula-based approach or use Power Query. The formula approach is `=IF(ISNUMBER(A2),A2,DATEVALUE(SUBSTITUTE(A2,"-"," ")))` plus additional handling, but it is fragile when the locale changes and when strings are ambiguous like 03/07/2025. Power Query wins here because you can explicitly set data types with a locale, split ambiguous formats into separate parsing rules, and get a repeatable, refreshable pipeline. In PQ, you would detect numeric dates, parse text dates with the correct locale, then output a single typed Date column.
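To make the "explicit parsing rules" point concrete, here is a small Python sketch (standard library only) of the same strategy: handle numeric serials first, then try declared text formats in a fixed, deliberately day-first order. The format list and the 1900-system epoch are stated assumptions, not a general-purpose parser:

```python
from datetime import date, datetime, timedelta

EXCEL_EPOCH = date(1899, 12, 30)    # day-zero for modern serials on the 1900 system
FORMATS = ["%d/%m/%Y", "%d-%b-%Y"]  # explicit rules: we CHOOSE day-first for this file

def parse_cell(value):
    """Return a date for a numeric Excel serial or a text date; None if unparseable."""
    if isinstance(value, (int, float)):          # true Excel date stored as a number
        return EXCEL_EPOCH + timedelta(days=int(value))
    for fmt in FORMATS:                          # try each declared format in order
        try:
            return datetime.strptime(value, fmt).date()
        except ValueError:
            pass
    return None                                  # flag for review; never guess

print(parse_cell("03/07/2025"))  # 2025-07-03 (day-first, by explicit choice)
print(parse_cell("7-Mar-2025"))  # 2025-03-07
```

The point is the same one Power Query makes: the day/month decision lives in a declared rule you can audit, not in whatever locale Excel happens to guess.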

Practice more Data Cleaning and Text Parsing questions

Pivot Tables and Aggregation Analysis

Pivot table questions go beyond basic drag-and-drop to test your understanding of aggregation logic and data modeling. Senior roles expect you to handle complex scenarios like calculating customer-level metrics from transaction-level data, excluding certain records without filtering the source, or explaining why pivot results don't match manual calculations.

The biggest conceptual gap is between row-level and entity-level analysis. When calculating average order value, you need order-level aggregation first, then customer-level averages, not a simple average of all line items. Interviewers love asking about Count vs Distinct Count because it reveals whether you understand what you're actually measuring.
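The row-level vs entity-level gap is concrete enough to demonstrate in code. A short Python sketch with made-up line items, showing why the naive line-item average differs from true average order value:

```python
# Roll line items up to orders FIRST, then average orders per customer.
# All data below is invented for illustration.
from collections import defaultdict
from statistics import mean

line_items = [  # (customer, order_id, amount)
    ("A", 1, 50), ("A", 1, 50), ("A", 2, 100),
    ("B", 3, 10), ("B", 3, 10), ("B", 3, 10),
]

order_totals = defaultdict(float)
for cust, order, amt in line_items:
    order_totals[(cust, order)] += amt           # step 1: order-level aggregation

by_customer = defaultdict(list)
for (cust, _), total in order_totals.items():
    by_customer[cust].append(total)              # step 2: group orders by customer

aov = {c: mean(v) for c, v in by_customer.items()}
print(aov)                                        # {'A': 100.0, 'B': 30.0}
print(mean(amt for *_, amt in line_items))        # naive line-item average: ~38.33
```

Customer A's true AOV is 100 and B's is 30, while the naive average over all six line items is about 38.33 and answers a different question entirely.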


Pivot questions check whether you can summarize data quickly, add calculated fields, and create drilldowns that answer stakeholder questions. You might struggle if you rely on defaults, misunderstand grouping, or cannot reconcile pivot results to raw data.

You have a transactions table with Date, Region, Product, Revenue, and Units. A VP asks for Q2 revenue by Region with Product as a drilldown, and they want to exclude refunds that show up as negative Revenue. Build the pivot and explain how you would validate the totals against the raw data.

Amazon · Medium

Sample Answer

Reason through it: First, you create a PivotTable from the full table range, then put Region in Rows and Revenue in Values as Sum. Next, you filter Date to Q2 using the pivot filter or group Dates by Quarters, then filter Revenue to show only values greater than 0, or better, add a RefundFlag field in the source and filter it out to avoid accidentally dropping legitimate negatives. Then you add Product under Region in Rows so you can expand and collapse for drilldown. To validate, you copy the pivot grand total for Q2 and compare it to a SUMIFS on the raw table using the same Q2 date bounds and refund logic, making sure the two numbers match exactly.
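The validation step generalizes beyond Excel: compute the filtered total two independent ways and demand exact agreement. A minimal Python sketch with illustrative transactions:

```python
# Compute the Q2, refund-excluded total two independent ways and reconcile.
# Rows and amounts are made up for illustration.
from datetime import date

rows = [
    (date(2025, 4, 5), "EMEA", "Widget", 100),
    (date(2025, 5, 9), "AMER", "Widget", 250),
    (date(2025, 5, 20), "EMEA", "Gadget", -40),   # refund
    (date(2025, 8, 1), "AMER", "Widget", 999),    # outside Q2
]
q2_start, q2_end = date(2025, 4, 1), date(2025, 6, 30)
keep = [r for r in rows if q2_start <= r[0] <= q2_end and r[3] > 0]

by_region = {}
for _, region, _, rev in keep:                    # the "pivot" rollup
    by_region[region] = by_region.get(region, 0) + rev

sumifs_style = sum(r[3] for r in rows             # the independent SUMIFS-style check
                   if q2_start <= r[0] <= q2_end and r[3] > 0)
assert sum(by_region.values()) == sumifs_style    # totals must reconcile exactly
print(by_region, sumifs_style)                    # {'EMEA': 100, 'AMER': 250} 350
```

Both paths apply the same date bounds and refund rule, which is exactly what the SUMIFS cross-check in the answer enforces.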

Practice more Pivot Tables and Aggregation Analysis questions

What-If Analysis and Financial Modeling

Financial modeling questions assess your ability to build flexible, scenario-ready models that executives can actually use for decision-making. At consulting firms and investment banks, your models need assumption toggles, sensitivity analysis, and clear separation between inputs, calculations, and outputs. The interviewer wants to see structured thinking, not just correct math.

Most candidates build models that work for one scenario but become unmaintainable when business assumptions change. Your DCF might be mathematically correct, but if discount rates are hardcoded inside NPV formulas instead of referenced from an inputs section, it's not executive-ready. Always design for multiple scenarios and easy assumption changes.


To evaluate business judgment, you will build mini models with assumptions, scenario tables, and sensitivity analysis, often on the fly. This is where candidates stumble by hardcoding inputs, mixing units, or failing to make the model auditable and easy to change.

You are modeling a subscription product for a leadership review. Build an Excel mini model where revenue depends on starting customers, monthly churn, and monthly new adds, and you can toggle between Base, Upside, and Downside assumptions without changing formulas.

Amazon · Medium

Sample Answer

This question is checking whether you can separate inputs from logic, avoid hardcoding, and make scenarios auditable. Put all assumptions in a clearly labeled Inputs block, then map Base, Upside, Downside values using a scenario selector cell plus CHOOSE or XLOOKUP. Model customers by month with $$\text{EndCustomers}_t = \text{StartCustomers}_t \times (1-\text{Churn}) + \text{NewAdds}$$ and link revenue to customers times ARPU. Use consistent units, for example churn as a monthly rate, and format assumption cells distinctly so an interviewer can change one number and see the entire model update.
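The scenario-selector pattern can be sketched in a few lines of Python; all assumption values here are invented for illustration, and the `scenarios` dict plays the role of the Inputs block plus the CHOOSE/XLOOKUP selector cell:

```python
# Scenario-driven subscription model: inputs separated from logic.
# Every number below is an illustrative assumption.
scenarios = {
    "Base":     {"start": 1000, "churn": 0.03, "adds": 80,  "arpu": 25.0},
    "Upside":   {"start": 1000, "churn": 0.02, "adds": 120, "arpu": 25.0},
    "Downside": {"start": 1000, "churn": 0.05, "adds": 50,  "arpu": 25.0},
}

def run(scenario_name, months=12):
    """Roll customers forward month by month and return monthly revenue."""
    p = scenarios[scenario_name]             # the single "selector cell"
    customers, revenue = p["start"], []
    for _ in range(months):
        customers = customers * (1 - p["churn"]) + p["adds"]  # the recurrence
        revenue.append(customers * p["arpu"])
    return revenue

print(round(run("Base")[0], 2))   # month 1: (1000*0.97 + 80) * 25 = 26250.0
```

Switching scenarios means changing one name, not any formulas, which is the auditability property the interviewer is probing for.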

Practice more What-If Analysis and Financial Modeling questions

Advanced Automation and Performance

Advanced questions test your judgment about when to use Excel vs other tools, and how to optimize performance when datasets get large. Tech companies and quantitative roles expect you to handle millions of rows efficiently, understand memory constraints, and know when Power Query beats formulas. These aren't just technical skills, they're strategic decisions about tool selection.

The key insight is that Excel has multiple engines with different strengths. Formulas are great for calculations but slow for large datasets. Pivot tables are fast for aggregation but limited for complex logic. Power Query excels at data transformation but requires different debugging skills. Senior analysts choose the right tool for each problem and explain their reasoning clearly.


Finally, interviewers differentiate candidates by whether they can scale workflows using Power Query, dynamic arrays, and structured tables while keeping files fast and maintainable. You will struggle if you cannot choose between formula-based and query-based solutions, or if your workbook becomes fragile.

You receive a 1.2M row transaction extract monthly and need to produce a P&L pivot by desk and month with a refresh button. Would you build it with formulas and dynamic arrays, or with Power Query and the Data Model, and why?

Goldman Sachs · Hard

Sample Answer

The standard move is Power Query to load and shape the data, then the Data Model and a PivotTable for aggregation, because it scales and refreshes cleanly. But here, the exception is if you only need a small filtered slice and interactive what-if logic, dynamic arrays on an Excel Table can be faster to iterate on. Power Query is best when the transformation steps are stable and repeat monthly, and you want a single refresh to rebuild outputs. Keep formulas for last-mile presentation or scenario knobs, not for row-by-row transforms across 1.2M rows.
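To see the shape of the query-based approach, here is a Python sketch of the role Power Query and the Data Model play: a single repeatable pipeline that rebuilds the desk-by-month rollup from the raw extract in one pass (data is illustrative):

```python
# One refreshable pipeline instead of per-cell formulas across 1.2M rows.
# The extract below is invented for illustration.
from collections import defaultdict

def build_pnl(rows):
    """rows: (desk, 'YYYY-MM', amount) tuples -> {(desk, month): total}."""
    pnl = defaultdict(float)
    for desk, month, amount in rows:       # single pass over the extract
        pnl[(desk, month)] += amount
    return dict(pnl)

extract = [("Rates", "2025-01", 1200.0), ("FX", "2025-01", -300.0),
           ("Rates", "2025-02", 800.0), ("Rates", "2025-01", 100.0)]
print(build_pnl(extract))
# {('Rates', '2025-01'): 1300.0, ('FX', '2025-01'): -300.0, ('Rates', '2025-02'): 800.0}
```

Next month's refresh is just re-running the same function on the new extract, which is exactly the "refresh button" property that makes Power Query the default choice here.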

Practice more Advanced Automation and Performance questions

How to Prepare for Excel Interviews

Practice Building Models From Scratch

Don't just solve formula puzzles. Start with a business scenario, build the entire model including inputs, calculations, and outputs, then stress-test it with different assumptions. This mirrors real interviews where you get 20 minutes to build something complete.

Master Data Quality Diagnostics

Learn to spot hidden characters with LEN() and CODE(), use TRIM() and CLEAN() effectively, and validate data types with ISNUMBER(). Practice on messy real datasets, not clean textbook examples, so you recognize quality issues quickly during live interviews.
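These diagnostics translate directly into code, which is a good way to build intuition for what LEN(), CODE(), TRIM(), and CLEAN() actually do. A Python sketch with a hypothetical dirty string:

```python
# LEN/CODE-style diagnostics plus a TRIM/CLEAN analogue (illustrative string).

def diagnose(s):
    """Report length and any non-printable-ASCII code points, like CODE() per char."""
    suspects = [(ch, ord(ch)) for ch in s if ord(ch) < 32 or ord(ch) > 126]
    return len(s), suspects

def clean(s):
    """TRIM/CLEAN analogue: normalize NBSP, drop control chars, collapse spaces."""
    s = s.replace("\u00a0", " ")                    # non-breaking space -> space
    s = "".join(ch for ch in s if ord(ch) >= 32)    # CLEAN(): strip control chars
    return " ".join(s.split())                      # TRIM(): trim and collapse

dirty = "ACME\u00a0Corp\r"
print(diagnose(dirty))   # (10, [('\xa0', 160), ('\r', 13)])
print(clean(dirty))      # 'ACME Corp'
```

Note that Excel's CLEAN() only strips control characters; it leaves the non-breaking space (code 160) alone, which is exactly why the per-character CODE() check matters.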

Build Your Reference Troubleshooting Toolkit

Practice switching between F4 states ($A$1, A$1, $A1, A1) without thinking, and trace formula errors using Ctrl+[ to find precedents. Interviewers notice candidates who debug efficiently vs those who stare at broken formulas.

Time Yourself on End-to-End Problems

Set 15-minute timers and solve complete business problems, not isolated functions. Real interviews have time pressure, so practice building models, checking your work, and explaining your approach within realistic timeframes.

Learn Power Query Basics for Large Datasets

Understand when Power Query beats formulas for data transformation, especially with 100K+ rows. Practice basic merge operations and data type conversions so you can discuss tool trade-offs intelligently during architecture questions.


Frequently Asked Questions

How deep does my Excel knowledge need to be for a Data Analyst interview?

You should be comfortable cleaning messy data, building pivot tables, writing common formulas, and creating clear charts. Expect intermediate functions like XLOOKUP or INDEX-MATCH, SUMIFS, COUNTIFS, IF, and basic text and date functions. You may also be asked to explain when you would use Power Query or Power Pivot and how you would validate results.

What types of companies ask the most Excel questions in Data Analyst interviews?

Companies with heavy reporting workflows tend to probe Excel the most, including finance, operations, logistics, and many mid-sized businesses that run dashboards in spreadsheets. Consulting and analytics teams that deliver client-ready outputs often test formatting, pivots, and data cleaning steps. Startups may ask fewer Excel questions if they are fully SQL and BI tool focused, but Excel still shows up for quick analysis tasks.

Do I need coding for an Excel focused Data Analyst interview?

You usually do not need programming to answer Excel questions, but you may be expected to translate analysis logic into formulas and pivots quickly. Some teams pair Excel with SQL or Python, so you might get a short coding screen alongside Excel. If coding is part of the role, practice at datainterview.com/coding and keep your Excel explanations tied to real analysis steps.

How do Excel interview questions differ across Data Analyst roles?

For business reporting analysts, interviews lean toward pivots, dashboards, conditional formatting, and stakeholder-friendly tables. For product or marketing analysts, you will see cohort-style calculations, date bucketing, and funnel metrics built with formulas or Power Query. For finance-leaning analyst roles, expect modeling basics like scenario tables, sensitivity checks, and error-proofing with structured references.

How can I prepare for Excel interviews if I have no real world experience?

Build a small portfolio of Excel files that show data cleaning, a pivot based summary, and a charted dashboard from a public dataset. Practice common tasks like splitting text, removing duplicates, handling blanks, and reconciling totals with SUMIFS checks. Use datainterview.com/questions to drill Excel interview prompts, then recreate the answers in a workbook so you can explain your steps.

What are the most common Excel mistakes candidates make in interviews, and how do I avoid them?

A common mistake is using the wrong reference type: forgetting to lock ranges with absolute references breaks formulas when they are copied. Another is relying on manual filters or copy-paste instead of pivots, tables, or Power Query steps that are repeatable. Finally, always validate outputs with quick sanity checks like row counts, totals, and spot checks on a few records.


Written by

Dan Lee

Data & AI Lead

Dan is a seasoned data scientist and ML coach with 10+ years of experience at Google, PayPal, and startups. He has helped candidates land top-paying roles and offers personalized guidance to accelerate your data career.

Connect on LinkedIn