This is an ongoing post with my thoughts on how AI is changing how we learn and how to keep your skills relevant.
Andrej Karpathy, a widely respected AI researcher and educator, offered an insight that makes for a useful mental model:
“You can outsource your thinking, but you can’t outsource your understanding.”
AI tools are accelerating how we build software. For data scientists, this means we can easily build scripts, notebooks, plugins and data pipelines for analysis. Here are some of my thoughts on what this means for you.
- Domain expertise matters more, not less
- If you are trying to build satellite imagery analytics to help farmers, a deep understanding of agricultural practices and socio-economic constraints, along with the capabilities of remote sensing data, is critical. The implementation of calculating NDVI from a Sentinel-2 image can be outsourced to an AI model, but you still need to understand exactly how that data will help the farmer.
- Similarly, if you are tasked with building a data processing pipeline, writing an SQL query to extract a subset from a Parquet file can be outsourced to an AI model, but you still need to know the use cases and understand the data formats. For example, understanding the nuances of the GeoParquet format would be critical to designing a cost-effective solution for large datasets.
- With so many new datasets being released, finding and picking the right dataset is now a core skill. All datasets have inherent limitations, and you need to be able to ask the right questions to decipher the nuances. For example, Google recently released a large flood archive dataset called GroundSource. This is a useful data source, but unless you spend time understanding how the data was generated and evaluating it for your specific task, it is easy to misuse the data and grossly overestimate flood events.
- Your job shifts from building to directing and validating
- When AI can rapidly generate solutions, the higher-value skill becomes identifying which problems are worth solving.
- Instead of writing code and carrying out the analysis yourself, your job is now to guide the process. You need to identify the problem to solve, pick the right dataset, guide the model to pick the appropriate stack, and evaluate the results.
- AI is making it easy to do deep analysis or take on projects that were simply too hard or would take too long. A recent participant from our Python course was able to carry out a project that used her expertise in mineral exploration and risk assessment to build a sophisticated data-driven simulation. AI-coding tools helped her build this without a lot of coding experience, but her domain expertise and understanding are what enabled the outcome.
- Be deliberate about what you think about
- As AI tools get better at many tasks, you now have the freedom to choose what to think about. You still need to think deeply about topics that you care about – but at the same time, automate and outsource tasks that take up your time and mental bandwidth without adding value to your core task.
- For example, I am now using AI tools to automate tasks around running my business and streamlining logistics, which is freeing up my time to think deeply about how I can help people learn advanced skills.
- When I am working on a data analysis project, I use AI to write the boilerplate code and do data cleaning, which allows me to spend more time handling edge cases, validating assumptions, and thinking more deeply about the problem itself.
AI raises the value of your domain expertise and judgement. The nitty-gritty of implementation can be outsourced, while your focus should be on developing a deep understanding of the problem space.
How are you dealing with changes due to advances in AI? Do you agree with these views? I would love to hear your thoughts and counter-arguments in the comments.
