Summer Internship at DHAKMA IO (CEO is a CS PhD Alum)

The CEO of DHAKMA IO, a CS PhD alum, is looking for students who've actually built things on hardware (Pi, Arduino, Jetson, CUDA, that kind of thing) and are excited about computer vision, robotics, autonomous systems, and LLMs. Flexible timing, paid, and part-time works too if the right person comes along.


AI SYSTEMS INTERN: VISION, ROBOTICS & LLMS

We build and deploy real systems in the field: computer vision,
autonomous guidance, robotics, graphics, and LLM-powered
applications. We ship products, not slide decks. Summer or
part-time works. If you've built something real, keep reading.

W H A T  Y O U ' L L  W O R K  O N
DRONE DETECTION & TRACKING
Real-time detection and classification pipelines running on edge
hardware. Raspberry Pi 5 + Hailo AI accelerators. Actual field
deployments.
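To give a flavor of the kind of building block these pipelines rely on (a toy sketch for illustration only, not Dhakma's actual code), here is greedy non-max suppression, the standard step that collapses overlapping detections into one box per object:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, thresh=0.5):
    """Greedy non-max suppression: visit boxes by descending score,
    keep a box only if it overlaps no already-kept box above thresh."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= thresh for j in keep):
            keep.append(i)
    return keep

boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # → [0, 2]: the two near-duplicates collapse to one
```

In production this runs on every frame after the detector head, so on a Pi-class device it is usually vectorized or done in the accelerator's post-processing, but the logic is exactly this.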
AUTONOMOUS GUIDANCE & INDUSTRIAL VISION
We've built autonomous guidance stacks from scratch. We also ship
defect detection, dimensional measurement, and visual inspection
systems deployed on-prem for oil & gas and manufacturing. No cloud.
Ships in a Pelican case.
CUDA & GRAPHICS PROGRAMMING
GPU compute pipelines, shader programming, real-time rendering
and simulation. If you've written CUDA kernels or played with
graphics APIs for fun, you'll fit right in.
LLM DEPLOYMENT & AI APPLICATIONS
Deploying and fine-tuning LLMs on-premise for clients who can't use
OpenAI. Quantization, RAG pipelines, inference optimization on
constrained hardware. Making large models actually run in the real
world.
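The heart of a RAG pipeline is the retrieval step: score stored documents against a query and hand the best matches to the model as context. As a toy illustration only (real deployments use embedding models and a vector store, not word counts), the idea fits in a few lines:

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts for a lowercased, whitespace-split string."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    """Return the top-k documents most similar to the query."""
    q = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

docs = [
    "quantized llama model served on a jetson",
    "pelican case wiring diagram for field deployment",
    "rag pipeline indexing maintenance manuals on-prem",
]
print(retrieve("on-prem rag pipeline for manuals", docs, k=1))
```

Swap the word counts for embeddings from a local model and this becomes the retrieval half of an air-gapped RAG system; the retrieved text is then packed into the LLM's prompt.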
DHAKMA CORE — SOVEREIGN AI APPLIANCE
Our flagship product. Air-gapped, on-premise AI in a purpose-built
hardware enclosure. You'll work on the full stack from inference
optimization to the Linux services that run it.

W H A T  W E  A C T U A L L Y  N E E D
▶ You've deployed something on a Raspberry Pi, Jetson, or
Arduino and it actually worked — not just in a tutorial, in your
own project
▶ Linux is your daily environment or you've spent real time
developing on it. You're comfortable with the terminal, systemd,
SSH, and troubleshooting hardware over serial
▶ Strong programming fundamentals — C++, Python, Rust,
whatever. Language doesn't matter, ability to write fast, correct
code that talks to hardware does
▶ Hands-on experience with LLMs: running them locally, fine-tuning,
building RAG pipelines, or optimizing inference. You know your
way around Ollama, llama.cpp, or Hugging Face beyond just
calling an API
+ Experience with OpenCV, PyTorch, or ONNX inference on edge
hardware is a real plus
+ Any background in computer vision, robotics, or embedded
systems coursework
+ Anyone who's done actual on-device programming for fun or
projects (graphics, Pi, Arduino, GPU compute) is very welcome

A B O U T  D H A K M A
Dhakma IO is an AI product and productized services company founded by Dr. Tharindu "Mackie" Mathew, a Purdue CS PhD and former
Microsoft Azure engineer. Our products include Dhakma Core — a sovereign, air-gapped private AI appliance — alongside productized vision
and robotics systems for defense, manufacturing, and energy. Everything ships as a finished product, not an open-ended project. All work is
hands-on and US-based. You'll work directly with Mackie: no bureaucracy, no intern busywork. The photos above are actual hardware we've
built and flown.


U S  P E R S O N S  P R E F E R R E D — N O  S P O N S O R S H I P  A V A I L A B L E  A T  T H I S  T I M E
SHOW US WHAT YOU'VE BUILT
Send a short note + resume (and a GitHub or project link if you have one) to contact@dhakma.io
dhakma.io — West Lafayette, IN

Last Updated: Mar 25, 2026 8:51 AM