Seamlessly integrated intelligent systems for real-world operations.
We design and build intelligent systems grounded in practical automation, data workflows, and custom tools that save time, reduce errors, and make teams faster, without the fluff.
Services
What we typically deliver. We're open to other projects, subject to review.
AI copilots for workflows
Draft emails, summarize docs, qualify leads, generate reports — with guardrails and auditability.
Process automation
Eliminate repetitive admin with APIs, webhooks, and integrations across your stack.
Dashboards & intelligence
Track KPIs, performance, and bottlenecks with clean dashboards and reliable data feeds.
Rapid prototypes (1–2 weeks)
We build proof-of-concepts fast: working UI, real data, and a clear path to production.
Deployment & handover
Ship it properly: environment setup, logging, docs, and training so your team can run it.
Portfolio
Selected systems and prototypes.
LeadVizion (B2B lead intelligence)
Scrapes, qualifies, and structures dealership leads into operational logs with repeatable rules.
SmartNudge (slide intelligence)
Prototype: analyze decks, extract structure, and generate guidance for consultants and teams.
How we work (scope → build → ship)
- Discovery (45–90 mins): goals, constraints, data, risk.
- Blueprint: architecture, timeline, acceptance criteria.
- Build: short iterations with demos & feedback.
- Ship: deployment, docs, handover, optional support.
Labs (R&D)
Experiments and prototypes that show range.
MotionSynthSystem (Prototype, 2022)
Gesture-controlled music system: Kinect → TouchDesigner → OSC → Max for Live → Ableton.
Goal: translate movement in space into musical control for performance/installations.
Signal flow: Kinect v2 → TouchDesigner (Kinect CHOP → Select → Math) → OSC over UDP (port 5000) → Max for Live mapping → Ableton MIDI instruments.
- Two-hand control: left/right hands drive separate musical parts (harmony-ready).
- Performer guidance grid: spatial zones mapped to notes/behaviours.
- Stable routing via OSC addresses /52–/55 for left/right X/Y.
- Multi-person tolerance (max players set to avoid interference).
V2 roadmap: calibration UI, packaging/launcher, smoother tracking & presets, optional ML gesture recognition.
Read the 1-page case study
MotionSynthSystem (Prototype, 2022)
Real-time gesture-controlled music system using Kinect, TouchDesigner, OSC, Max for Live & Ableton. Built to explore how full-body movement can control harmony and performance parameters with low-latency, expressive mapping.
The problem
Traditional MIDI controllers can feel restrictive for performance and installations. I wanted a system where movement in space becomes a controllable instrument — intuitive for the performer and legible to the audience.
The solution
A modular pipeline that tracks body position, extracts hand coordinates, and routes normalized control signals into Ableton instruments.
Signal flow
Kinect v2 → TouchDesigner (Kinect CHOP → Select → Math) → OSC over UDP (port 5000) → Max for Live (OSC route & scaling) → Ableton MIDI tracks/instruments
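For illustration, here is a minimal Python sketch of the OSC leg of this pipeline, assuming the python-osc library. In the actual prototype TouchDesigner sends these messages natively, so the script and its function name are illustrative only; the port (5000) and addresses (/52–/55) are the ones listed above.

    # Illustrative only: send normalized hand coordinates as OSC over UDP,
    # mirroring what TouchDesigner's OSC output does in the prototype.
    from pythonosc.udp_client import SimpleUDPClient  # assumes python-osc is installed

    client = SimpleUDPClient("127.0.0.1", 5000)  # OSC over UDP, port 5000

    def send_hands(left_x, left_y, right_x, right_y):
        """Send -1..1 hand coordinates on the addresses the Max patch listens to."""
        client.send_message("/52", left_x)   # left hand X
        client.send_message("/53", left_y)   # left hand Y
        client.send_message("/54", right_x)  # right hand X
        client.send_message("/55", right_y)  # right hand Y

    send_hands(0.1, -0.2, 0.8, 0.9)  # example frame: right hand raised high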
Key features
- Two-hand control: left and right hands independently drive different musical parts (enabling harmony).
- Performer guidance grid: gridded overlay to help map zones in space to notes/behaviours (“4 boxes to a note”); a small sketch of this mapping follows the list.
- Stable routing: hand axes routed as OSC addresses /52–/55 (left X/Y, right X/Y) for clean mapping.
- Multi-person tolerance: tracking configured to avoid interference when others are in the room.
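To make the guidance grid concrete, here is a minimal Python sketch of how a normalized hand position could be quantized into grid cells and grouped four cells to a note. The grid size, note numbers, and function name are illustrative assumptions, not values taken from the actual patch.

    # Illustrative only: quantize a normalized hand position into a grid cell,
    # then group cells into note zones ("4 boxes to a note").
    def position_to_note(x, y, grid=4, scale=(60, 62, 64, 67)):
        """Map x, y in -1..1 onto a grid x grid overlay, four cells per note."""
        col = min(int((x + 1) / 2 * grid), grid - 1)
        row = min(int((y + 1) / 2 * grid), grid - 1)
        cell = row * grid + col                      # 0..15 on a 4x4 grid
        zone = cell // (grid * grid // len(scale))   # 4 cells per note zone
        return scale[min(zone, len(scale) - 1)]

    print(position_to_note(-0.9, -0.9))  # bottom-left corner -> first note (60)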
Technical notes (what’s actually happening)
- TouchDesigner extracts world-space hand positions, then normalizes ranges (initially -1 to 1).
- Data is transmitted via OSC over UDP to Max, then mapped/scaled into usable control values (a small scaling sketch follows this list).
- Ableton is configured with paired MIDI tracks per hand so each hand can control separate instruments.
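A minimal sketch of the scaling step in these notes: a linear rescale from the normalized -1..1 tracking range into a 0..127 MIDI-style control range. The exact ranges, curve, and any smoothing in the Max for Live device may differ; the function name is illustrative.

    # Illustrative only: clamp, then linearly map a -1..1 value to 0..127.
    def to_midi_cc(value, in_min=-1.0, in_max=1.0, out_min=0, out_max=127):
        value = max(in_min, min(in_max, value))
        span = (value - in_min) / (in_max - in_min)
        return round(out_min + span * (out_max - out_min))

    print(to_midi_cc(0.0))   # centre of the range -> 64
    print(to_midi_cc(-1.0))  # bottom of the range -> 0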
Outcome
A working prototype demonstrating real-time motion-to-sound mapping with reliable routing and a repeatable setup process.
What I’d improve in v2 (production-ready)
- Calibration UI (range & sensitivity presets per space/performer)
- Packaging / “one-click run” launcher & better fault tolerance
- Smoother tracking, latency tuning, and mapping presets
- Optional ML gesture recognition for discrete gestures (e.g., “raise arm → trigger mode”)
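As a rough sketch of the optional ML direction (not part of the current prototype): a small classifier over recorded pose features could map frames to discrete gestures such as “raise arm → trigger mode”. The features, labels, and model choice below are placeholders, and scikit-learn is an assumed dependency.

    # Rough sketch of the optional v2 idea, not part of the current prototype.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Placeholder training data: [left_y, right_y] per frame, labelled by gesture.
    X = np.array([[-0.2, -0.3], [-0.1, 0.0], [0.9, 0.8], [0.8, 0.9]])
    y = np.array(["neutral", "neutral", "arms_raised", "arms_raised"])
    model = KNeighborsClassifier(n_neighbors=1).fit(X, y)

    def gesture_for(left_y, right_y):
        """Classify one frame of hand heights into a discrete gesture label."""
        return model.predict([[left_y, right_y]])[0]

    print(gesture_for(0.85, 0.9))  # -> "arms_raised" (would trigger a mode change)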
Demo (embedded)
Artifacts
TouchDesigner (.toe), Ableton (.als), Max patch (.maxpat) available on request.
Screenshot plan
- Logo/title slide (clean intro)
- Folder view (shows tidy packaging & files)
- TouchDesigner Kinect view (grid, tracking, and node network)
- TouchDesigner parameter selection / OSC out (the “plumbing”)
- Ableton view showing Max for Live device / routing (proof it’s driving MIDI)
- Physical setup photo (Kinect on stand, laptop, performer)
What we can build from this
Interactive systems for creative studios, installations, and product demos — built to be reliable in the real world.
Example deliverables
- Calibration & presets for different rooms/performers
- One-click run & auto-recovery (fault tolerant)
- Clean mapping UI (no patching required by end users)
- Optional gesture classification (ML) for discrete triggers
About
Founder
Sean Jibowu — Postgraduate Diploma in Artificial Intelligence, Computer Science. Builds automation and AI-enabled tools that turn messy workflows into repeatable, organized systems. Background across product prototypes, real-time systems, and operational tooling.
Ships practical tooling for lead generation and enrichment, dashboards, LLM-assisted workflows, and internal apps that save team time, reduce errors, and support training processes.
Worked in UK motor finance operations, building tooling for dealer outreach, lead qualification, FCA compliance, and proposal handling.
Understands regulated sales environments, process discipline, and information technology.
Expert in bridging the gap between business and engineering: clear requirements, fast delivery, no hand-waving.
Principles
We optimise for real outcomes: clarity, reliability, and systems your team can actually use.
Contact
Send a message and we’ll reply with next steps.
Message
What to include
To get an accurate quote, include:
- What you do today (current workflow)
- Tools you use (Sheets, CRMs, inboxes, etc.)
- Volume (per day/week)
- Constraints (compliance, budget, deadline)
- What success looks like