About / UK-wide advisory, North East base
AI adoption and decision intelligence grounded in quantitative evidence.
Dr Mikhail Vasenin is a finance academic with a PhD in Quantitative Finance and an applied AI founder. He helps UK organisations turn AI experiments, forecasts, dashboards, and analytical signals into clearer decisions and practical workflows.
The work is useful when teams need to decide what to automate, which evidence to trust, how forecasting should be reviewed, and where human judgement must remain in the loop.
A 30-45 minute call maps where AI, forecasting, or analytical workflow design can add value, what data exists, and what the first serious test would look like.
Who I work with
- Finance and strategy leaders who need clearer evidence before committing to AI or analytics work.
- Operations and reporting teams whose dashboards, forecasts, or manual workflows are not trusted enough to act on.
- Founders and innovation leads who need a serious route from AI experiments to usable workflows.
- Sustainability, ESG, and research teams that need better signal interpretation.
When to bring me in
- AI adoption is being explored, but priorities are unclear.
- Pilots exist, but the workflow is inconsistent or hard to govern.
- Forecasting and reporting depend too much on manual interpretation.
- Dashboard metrics are visible, but not clearly decision-relevant.
How engagements move
Diagnostic
Clarify the decision problem, evidence base, data readiness, risks, and likely first test.
Design
Shape the workflow, guardrails, assumptions, validation points, and human-in-the-loop responsibilities.
Pilot
Support a focused prototype or operating model without overbuilding before value is understood.
Handover
Document the logic, review cadence, and next actions so the team can keep improving.
Outputs
Tangible deliverables for teams that need clarity before implementation.
AI adoption priorities
Ranked use cases with risk, evidence quality, and implementation guardrails.
Forecasting discipline
Methods, assumptions, validation checks, and review cycles that make forecasts more accountable.
Decision-support workflow design
Clear boundaries between automated analysis, human judgement, escalation, and action.
Metrics and signal-quality audit
Review of whether dashboards, ESG signals, KPIs, or model outputs are fit for decisions.
Lightweight decision dashboards
Only where dashboard structure clarifies the decision, evidence, and next action.
Credibility
A profile built for rigorous but practical advisory work.
Academic base
Finance academic
Research and teaching foundation in empirical and quantitative finance.
Method depth
PhD in Quantitative Finance
Analytical training across finance, econometrics, and evidence-led methods.
Applied work
Applied AI founder
Founder experience translating analysis into practical systems and workflows.
Market focus
UK and North East base
Local ecosystem credibility with UK-wide advisory delivery.
Core lens
Decision intelligence
Forecasting, reporting, analytics, and human-in-the-loop AI use.
Themes
What each theme means for a client decision.
AI adoption
Decision: Which AI use cases deserve attention, budget, and governance.
Evidence: Workflow friction, data readiness, risk, validation needs, and human oversight points.
Output: AI opportunity map, guardrails, and a practical first-test plan.
Decision intelligence
Decision: How leaders turn reports, models, and uncertainty into better action.
Evidence: Signals, assumptions, forecast review cycles, KPIs, and decision ownership.
Output: Decision-support workflow, forecasting discipline, and review cadence.
Quantitative finance
Decision: How to interpret market events, anomalies, and finance-related evidence.
Evidence: Empirical finance, event-study framing, market data, and structured interpretation.
Output: Market intelligence brief, evidence framework, or analytical review.
Sustainability analytics
Decision: Which ESG or sustainability signals are useful enough to inform strategy.
Evidence: Rating movement, information quality, market response, and sustainability data caveats.
Output: Sustainability signal review, interpretation framework, or briefing note.
Intelligent systems
Decision: What should be automated, what should stay human, and how the workflow should be governed.
Evidence: Current processes, reporting load, model limits, user roles, and validation checkpoints.
Output: Workflow design, prototype direction, and handover-ready operating logic.
Approach
How evidence translates into better business decisions.
Event-study discipline
A way to reason about market, policy, product, or organisational events with clearer assumptions and less narrative drift.
Business impact: better interpretation of change, fewer overreactions, and clearer evidence before action.
Signal quality over surface metrics
A research-backed habit of asking whether a metric, ESG score, dashboard, or model output is decision-relevant.
Business impact: fewer false signals, less dashboard noise, and stronger confidence in what should be acted on.
Human-in-the-loop analytics
AI and automation are treated as workflow support, with judgement, validation, and accountability kept visible.
Business impact: less AI theatre, clearer governance, and more reliable adoption of analytical tools.
Mini case studies
Sanitised examples of work patterns and intended outcomes.
These are public-safe case-style descriptions. They avoid invented metrics and focus on the practical result: better structure, clearer evidence, and stronger decisions.
Business case
AI adoption priority map
A leadership or innovation team has many AI ideas, but no clear way to decide which use cases are worth testing.
Mikhail's angle
Frame AI opportunities around decision value, data readiness, workflow friction, risk, and human oversight.
Intended business effect
A shorter, better-ranked list of AI priorities with clearer next actions and fewer unfocused experiments.
Business case
Forecasting and reporting redesign
A team produces reports or dashboards, but the outputs do not consistently improve planning, forecasting, or action.
Mikhail's angle
Audit the signals, assumptions, review cycle, and decision points that sit around reporting and forecasting work.
Intended business effect
A more disciplined analytical workflow that connects data, interpretation, accountability, and forward-looking decisions.
Business case
Market intelligence briefing
A finance, fintech, or product team needs a sharper view of market signals, digital-asset questions, or event effects.
Mikhail's angle
Use empirical framing, event-study thinking, and structured interpretation rather than loose market commentary.
Intended business effect
Clearer evidence for strategy, product, investment research, or stakeholder briefing decisions.
Business case
Sustainability signal review
An organisation needs to interpret ESG or sustainability-related information without over-relying on headline labels.
Mikhail's angle
Assess signal quality, rating movement, market interpretation, and the link between sustainability data and decisions.
Intended business effect
Better sustainability-related judgement, clearer analytical caveats, and more decision-relevant reporting.
Business case
Evidence quality checkpoint
A team wants to adopt dashboard metrics or AI-generated analysis, but the reliability and decision value are unclear.
Mikhail's angle
Introduce a review point for signal quality, assumptions, validation, and where human judgement must remain visible.
Intended business effect
Fewer false signals, clearer governance, and less risk of adopting metrics or AI outputs that look useful but do not support action.
Research record
Selected publications add evidence depth to the consulting profile.
The publication record supports a serious business-facing position: quantitative finance, digital assets, sustainability information, market response, and event-study thinking.
Cryptocurrency value and 51% attacks: evidence from event studies
Shanaev, Shuraeva, Vasenin, Kuznetsov
Journal of Alternative Investments / Journal article
Research signal
Event-study discipline for interpreting market shocks and protocol-risk events.
Advisory relevance
Directly supports advisory framing around event effects, market structure, and risk interpretation.
Seasonality in the cross-section of cryptocurrency returns
Long, Zaremba, Demir, Szczygielski, Vasenin
Finance Research Letters / Journal article
Research signal
Cross-sectional return analysis and seasonal signal interpretation in crypto markets.
Advisory relevance
Strengthens the digital-asset and cross-sectional return analysis evidence base.
LGBT CEOs and stock returns: Diagnosing rainbow ceilings and cliffs
Shanaev, Skorochodova, Vasenin
Research in International Business and Finance / Journal article
Research signal
Market-response evidence around leadership signals, governance, and investor interpretation.
Advisory relevance
Demonstrates evidence-led analysis of leadership signals and capital-market interpretation.
When Bitcoin is high: cryptocurrency value, illicit markets and US marijuana bills
Shanaev, Johnson, Vasenin, Panta, Ghimire
Journal of Financial Regulation and Compliance / Journal article
Research signal
Links digital-asset valuation questions with institutional, regulatory, and market context.
Advisory relevance
Grounds advisory conversations about digital-asset valuation in institutional and regulatory evidence.
Start with a focused conversation