Education & Training
Formal Education
- Ph.D. in Zoology and Ecology, James Cook University — Cum laude (2024)
- Skills & Tools: Bayesian and frequentist inference, hierarchical modelling, machine learning, predictive modelling, spatial analysis (GIS, remote sensing), automated data pipelines, data cleaning, R programming, reproducible workflows.
- Data Types: Multi-source ecological, biological, climatic, physiological, and biogeochemical datasets (soil, foliage chemistry).
- Applications: Developed novel modelling frameworks to predict species vulnerability to extreme events and identify high-risk habitats.
- M.S. in Biology and Conservation of Biodiversity, Universidad de Salamanca (2016)
- Skills & Tools: GIS, advanced statistics, applied statistical modelling, R programming, spatial analysis, workflow automation.
- Applications: Designed and executed analytical workflows for biodiversity monitoring and conservation planning.
- B.S. in Biology, Universidad de Salamanca (2014)
- Skills & Tools: Mathematics, algebra, biostatistics, physics, introductory statistical programming, ecological modelling.
- Applications: Undergraduate research project integrating environmental and ecological data.
Statistical, Computational & Coding Training
Bayesian and hierarchical modelling
- Statistical Rethinking: A Bayesian Course with Examples in R and Stan
- Statistical Rethinking 2023 - Online Course by Richard McElreath
- Bayesian Methods for Ecology by Michael A. McCarthy
- Applied Hierarchical Modeling in Ecology
- Integrated Population Models
- ‘Statistics in R’ workshop with Dr Murray Logan (biostatistician at the Australian Institute of Marine Science)
Coding in R
- R for Data Science
- ‘Statistics in R’ workshop with Dr Murray Logan
- Completed a structured, workshop-based introduction to R for statistical analysis, covering core language fundamentals and the full workflow from data handling to modelling.
- Built proficiency with R basics, including objects, vectors, indexing, functions, and working efficiently in an interactive coding environment.
- Developed practical skills for handling real datasets using data frames, vectorised operations, and tidy data principles, with an emphasis on clear, reproducible analysis.
- Trained in code management and versioning practices and implemented reproducible research workflows using R Markdown.
- Applied modern data-science tooling in R, including data wrangling and high-quality visualisation with ggplot2.
- Strengthened statistical foundations through introductory principles and basic inference testing (hypothesis testing and interpretation).
- Extended modelling capability across a broad family of methods, including linear models, mixed-effects models, generalised linear (mixed) models, non-linear / GAM frameworks, and multivariate analyses, with exposure to both frequentist and Bayesian implementations.
- Reinforced professional practice by consistently explaining the purpose of each step (syntax → data transformation → model → interpretation), supporting readable, transparent, and reusable analysis pipelines.
Coding in Python & Machine learning
- The Complete Python Bootcamp: From Zero to Hero in Python (Click to view certificate)
- Completed a comprehensive, hands-on course covering the full Python stack from beginner fundamentals to advanced programming concepts.
- Built a strong professional foundation in syntax, data structures, control flow, functions, and Object-Oriented Programming (OOP) through practical exercises and projects.
- Developed the ability to write clean, efficient, and maintainable code, applying Python across automation, scripting, data handling, and application development.
- Explored Python’s standard library, automation patterns, file processing, and interactive workflows through end-to-end mini-projects.
- Core skills acquired:
- Python essentials: variables, data types, operators, loops, conditionals, and functions.
- Intermediate and advanced concepts: decorators, debugging, error handling, regex, and iterators.
- Functional and Object-Oriented Programming (classes, inheritance, encapsulation).
- Working with key modules: collections, datetime, os, math, random, re, and file I/O.
- Handling files, images, PDFs, CSVs, and automating repetitive workflows.
- Web scraping with BeautifulSoup and Requests.
- Email automation and productivity scripting.
- Building and debugging in both Jupyter notebooks and standalone .py files.
- Creating interactive mini-projects and games (e.g., Blackjack, Tic-Tac-Toe).
- Introductory GUI development and interactive notebook elements.
- Applying Python to real-world scenarios, from automation to small production-ready projects.
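As a brief illustration of two of the intermediate concepts listed above (decorators and structured error handling), a minimal retry decorator might look like the following; the `flaky` function and its failure pattern are invented for demonstration:

```python
import functools

def retry(times):
    """Decorator: re-call the wrapped function on ValueError, up to `times` attempts."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, times + 1):
                try:
                    return func(*args, **kwargs)
                except ValueError:
                    if attempt == times:
                        raise  # out of attempts: re-raise the last error
        return wrapper
    return decorator

calls = []

@retry(times=3)
def flaky():
    """Fails twice with a transient error, then succeeds."""
    calls.append(1)
    if len(calls) < 3:
        raise ValueError("transient failure")
    return "ok"

print(flaky(), len(calls))  # ok 3
```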
- Python for Data Science and Machine Learning Bootcamp (Click to view certificate)
- Completed a comprehensive, project-based introduction to data science and machine learning in Python, covering the full modelling workflow from data cleaning to evaluation and interpretation.
- Built practical experience across the modern Python data stack (NumPy, pandas, Matplotlib, Seaborn, Plotly) and implemented supervised, unsupervised, and introductory deep-learning models.
- Core skills acquired:
- Data wrangling and numerical computing with pandas and NumPy.
- Exploratory and statistical visualisation using Matplotlib, Seaborn, and Plotly.
- Machine learning with scikit-learn, including:
- Supervised learning: Linear/Logistic Regression, k-NN, Decision Trees, Random Forests, Support Vector Machines, feed-forward Neural Networks.
- Unsupervised learning: K-Means Clustering, PCA (dimensionality reduction).
- Introductory NLP (text cleaning, tokenisation, vectorisation with CountVectorizer/TF-IDF) and Naive Bayes classification.
- Introductory Recommender Systems (similarity metrics and collaborative filtering).
- Foundations of Deep Learning (activation functions, simple network training with Keras).
- Model evaluation, cross-validation, train/test workflows, bias–variance analysis, and performance optimisation.
- Exposure to big-data concepts with Spark, including handling distributed datasets in Python.
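The train/test workflow described above can be sketched from scratch without library dependencies; the 1-nearest-neighbour classifier and the two-cluster dataset below are illustrative stand-ins for their scikit-learn equivalents:

```python
import random

def train_test_split(data, test_frac=0.25, seed=0):
    """Shuffle and split (features, label) pairs into train and test sets."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_frac)
    return shuffled[n_test:], shuffled[:n_test]

def predict_1nn(train, x):
    """Classify x by the label of its nearest training point (squared Euclidean)."""
    def dist(a):
        return sum((ai - xi) ** 2 for ai, xi in zip(a, x))
    nearest = min(train, key=lambda pair: dist(pair[0]))
    return nearest[1]

def accuracy(train, test):
    """Fraction of test points whose 1-NN prediction matches the true label."""
    hits = sum(predict_1nn(train, x) == y for x, y in test)
    return hits / len(test)

# Two well-separated clusters, so near-perfect accuracy is expected.
data = [((i, i), 0) for i in range(10)] + [((i + 100, i + 100), 1) for i in range(10)]
train, test = train_test_split(data)
print(accuracy(train, test))  # 1.0
```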
- Machine Learning Specialisation, DeepLearning.AI (Andrew Ng) (Click to view certificate)
- Completed a foundational specialisation on machine learning fundamentals and applied techniques designed for practitioners entering AI and data science. The curriculum emphasises both conceptual understanding and implementation using Python and core libraries.
- Developed proficiency in supervised learning, advanced algorithms, unsupervised learning, recommender systems, and introductory reinforcement learning. Tools include Python, NumPy, and scikit-learn, with exposure to core TensorFlow concepts.
Module 1 — Supervised Machine Learning: Regression and Classification (Click to view certificate)
- Built and trained supervised models for prediction and classification tasks, including linear regression, logistic regression, feature scaling, model evaluation, and optimisation concepts.
Module 2 — Advanced Learning Algorithms (Click to view certificate)
- Explored deeper models and learning techniques such as neural networks, decision trees, ensemble logic, and optimisation strategies, with practical guidance on algorithm application.
Module 3 — Unsupervised Learning, Recommenders, and Reinforcement Learning (Click to view certificate)
- Covered clustering (e.g., K-means), anomaly detection, recommender systems (collaborative and content-based), and introductory reinforcement learning concepts.
SQL & Databases
- The Complete SQL Bootcamp: PostgreSQL & pgAdmin (Click to view certificate)
- Hands-on training in PostgreSQL for data analysis, reporting, and database querying, using practical exercises and real-world datasets.
- Gained proficiency with SQL fundamentals and intermediate query patterns for analytics workflows.
- Core skills acquired:
- SQL querying with SELECT, WHERE, ORDER BY, logical filtering, and pattern matching.
- Data aggregation with GROUP BY and HAVING.
- Table joins (INNER, LEFT/RIGHT, FULL, CROSS) and multi-table relational operations.
- Database design principles: tables, constraints, data types, and schema structure.
- Practical work using PostgreSQL + pgAdmin, including database setup, table creation, and data loading.
- Integrating SQL with Python as part of data-science pipelines.
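The join and aggregation patterns above can be sketched with Python's built-in sqlite3 module; the schema and survey data below are invented for illustration:

```python
import sqlite3

# In-memory database: a tiny (hypothetical) survey of species counts per site.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sites (site_id INTEGER PRIMARY KEY, habitat TEXT);
    CREATE TABLE surveys (site_id INTEGER, species TEXT, n INTEGER);
    INSERT INTO sites VALUES (1, 'rainforest'), (2, 'savanna');
    INSERT INTO surveys VALUES (1, 'possum', 4), (1, 'skink', 9),
                               (2, 'skink', 2), (2, 'wallaby', 7);
""")

# INNER JOIN plus GROUP BY / HAVING: total individuals per habitat,
# keeping only habitats with more than 5 individuals recorded.
rows = con.execute("""
    SELECT s.habitat, SUM(v.n) AS total
    FROM surveys AS v
    INNER JOIN sites AS s ON s.site_id = v.site_id
    GROUP BY s.habitat
    HAVING SUM(v.n) > 5
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('rainforest', 13), ('savanna', 9)]
```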
- SQL for Data Analysis: Advanced SQL Querying Techniques (Click to view certificate)
- Completed an applied advanced-SQL course focused on professional analytical querying and multi-table relational workflows.
- Core skills acquired:
- Advanced JOIN logic, including multi-table joins, self-joins, cross joins, and unions for complex relational analysis.
- Subqueries and Common Table Expressions (CTEs) (including recursive CTEs) to structure multi-step logic and improve readability of analytical pipelines.
- Window functions (ROW_NUMBER, RANK, DENSE_RANK, LAG, LEAD, FIRST_VALUE, LAST_VALUE) for ranking, rolling calculations, trend detection, and partition-based summarisation.
- Comprehensive string, numeric, and date functions, including pattern matching, conditional expressions, and reliable handling of NULLs and irregular values.
- Applied analytical patterns: de-duplication workflows, conditional aggregation, segmentation, pivot-style summaries, and multi-stage data transformations.
- Use of views and modular query structures to encapsulate reusable logic and streamline complex analytical workflows.
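The CTE and window-function patterns above can likewise be sketched with sqlite3 (this assumes a SQLite build with window-function support, version 3.25+, which ships with modern Python; the count data are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE counts (site TEXT, year INTEGER, n INTEGER);
    INSERT INTO counts VALUES
        ('A', 2020, 10), ('A', 2021, 14), ('A', 2022, 9),
        ('B', 2020, 5),  ('B', 2021, 8);
""")

# A CTE plus LAG(): year-on-year change in counts within each site.
rows = con.execute("""
    WITH yearly AS (
        SELECT site, year, n,
               LAG(n) OVER (PARTITION BY site ORDER BY year) AS prev_n
        FROM counts
    )
    SELECT site, year, n - prev_n AS change
    FROM yearly
    WHERE prev_n IS NOT NULL
    ORDER BY site, year
""").fetchall()

print(rows)  # [('A', 2021, 4), ('A', 2022, -5), ('B', 2021, 3)]
```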
Artificial Intelligence / Prompt Engineering
- ChatGPT Prompt Engineering for Developers (Click to view certificate)
- Completed a hands-on introduction to prompt engineering for building LLM-powered applications, taught by Isa Fulford (OpenAI) and Andrew Ng.
- Learned best practices for writing clear, reliable prompts and systematically refining them to improve output quality.
- Gained practical experience using the OpenAI API, including developing a custom chatbot in the course labs.
- Practised prompt engineering across core application areas such as summarisation, inference, text transformation, and generative expansion, developing intuition for how LLMs interpret and follow instructions.
- Built a conceptual understanding of LLM behaviour and how prompting can be used to prototype and deploy new capabilities.
- Core skills acquired:
- Writing concise, structured, instruction-focused prompts.
- Iterative refinement techniques for accuracy, robustness, and alignment.
- Designing prompts for summarisation, classification, transformation, and generation tasks.
- Using LLM APIs in practical development workflows.
- Building basic LLM-driven applications with reproducible prompting strategies.
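A minimal sketch of the structured, instruction-focused prompting style described above; the delimiter convention and helper function are illustrative common practice, not any specific course or API artifact:

```python
def build_summary_prompt(text: str, max_words: int = 30) -> str:
    """Assemble a delimited, instruction-focused summarisation prompt.

    Clear instructions plus explicit delimiters around untrusted input
    are a standard pattern for reliable LLM prompting.
    """
    return (
        f"Summarise the text between the #### delimiters "
        f"in at most {max_words} words, as a single sentence with no preamble.\n"
        f"####\n{text}\n####"
    )

prompt = build_summary_prompt("Cyclones increasingly threaten upland rainforest fauna.")
print(prompt)
```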
- Agentic AI by Andrew Ng (Click to view certificate)
- Completed Andrew Ng’s flagship introduction to Agentic AI — systems that can plan, act, reflect, and self-improve through autonomous workflows.
- Learned how agentic systems extend traditional prompting by integrating tools, memory, iterative reasoning, and multi-step planning for complex tasks.
- Built agents using reflection loops, tool calling, retrieval, search, APIs, and multi-agent collaboration, with hands-on code labs reinforcing each pattern.
- Studied key design patterns for real-world agents, including task decomposition, reflection-driven refinement, evaluation loops, and autonomous decision-making.
- Implemented agent workflows in Python, gaining practical intuition for how modern agent frameworks function under the hood.
- Core skills acquired:
- Understanding the Agentic AI architecture: planners, actors, critics, tools, memory, and feedback loops.
- Designing agents for multi-step reasoning, iterative improvement, and self-correction.
- Integrating tool calling and API actions to interact with external systems and retrieve live information.
- Building retrieval-augmented agents using vector databases and structured search.
- Constructing multi-agent systems, including coordinator–specialist patterns.
- Applying evaluation and debugging techniques such as trace inspection, behavioural tests, and performance refinement.
- Awareness of real-world applications of agentic workflows in data science, automation, research, and production AI systems.
- AI Agents in LangGraph (Click to view certificate)
- Completed a practical introduction to building stateful, production-ready AI agents using the LangGraph framework.
- Learned how to structure agent workflows as deterministic graphs with clear control flow, tool calling, memory management, and error handling.
- Developed agents capable of planning, reflecting, and iterating through multi-step tasks, with transparent behavioural tracing and debugging.
- Built complete agent pipelines in Python, gaining intuition for how LangGraph orchestrates LLMs, tools, and state updates.
- Core skills acquired:
- Designing agent workflows using LangGraph’s nodes, edges, and state objects.
- Integrating tool calling, APIs, and memory into multi-step agent systems.
- Implementing reflection loops, retries, and human-in-the-loop stages.
- Debugging and evaluating agents through event tracing and structured testing.
- Understanding how graph-based architectures support reliable, scalable agentic applications.
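The graph-based control flow described above can be illustrated with a dependency-free sketch: nodes are functions over a shared state, edges route between them, and a conditional edge implements a reflect-and-retry loop. This is not the LangGraph API itself, just the underlying pattern, with a string transform standing in for an LLM call:

```python
def plan(state):
    state["draft"] = state["task"].upper()  # stand-in for an LLM generation step
    return "reflect"

def reflect(state):
    state["attempts"] += 1
    # Conditional edge: loop back until the draft passes a check or we give up.
    if len(state["draft"]) < 5 and state["attempts"] < 3:
        state["task"] += "!"  # revise the input and retry
        return "plan"
    return "finish"

def finish(state):
    state["result"] = state["draft"]
    return None  # terminal node: no outgoing edge

NODES = {"plan": plan, "reflect": reflect, "finish": finish}

def run_graph(state, entry="plan"):
    """Walk the graph: each node mutates state and names the next node."""
    node = entry
    while node is not None:
        node = NODES[node](state)
    return state

state = run_graph({"task": "map", "attempts": 0})
print(state["result"], state["attempts"])  # MAP!! 3
```

Real frameworks add typed state objects, persistence, and tracing on top of this loop, but the node/edge/state decomposition is the core idea.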
