About
Intro
- Data-oriented, scalable, and parallel software design in data pipelines, 3D graphics, and simulation to build robust software products and systems.
- Extensive experience bootstrapping v1 software - from design to product.
Robust Software
Correct, Performant, Simple, Elegant, Extensible.
In other words, quality software! It requires a mindset and a depth of understanding that go beyond the industry's current favorite buzzwords. It demands disciplined, consistent execution of a set of fundamental principles born of attention to detail and a desire to improve one's craft. Writing robust software is an iterative process.
Software Engineering
Software Engineering is the process of taking an idea or an algorithm and turning it into a robust product. It starts with the ability to break a complex problem down into its constituent elements, then arrange and reassemble them - almost Lego style - so that parts can be swapped or upgraded without affecting the overall system.
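As a minimal sketch of that Lego-style composition (hypothetical Python with illustrative names, not code from any particular project), components written against a narrow interface can be swapped without touching the rest of the system:

    from typing import Protocol

    class Storage(Protocol):
        def save(self, key: str, data: bytes) -> None: ...

    class LocalDisk:
        # One interchangeable part: writes records to local files.
        def save(self, key: str, data: bytes) -> None:
            with open(key, "wb") as f:
                f.write(data)

    class InMemory:
        # Another part with the same interface: keeps records in RAM.
        def __init__(self) -> None:
            self.blobs: dict[str, bytes] = {}

        def save(self, key: str, data: bytes) -> None:
            self.blobs[key] = data

    def ingest(records: list[bytes], store: Storage) -> None:
        # The pipeline depends only on the Storage interface, so either
        # implementation can be plugged in without changing this code.
        for i, record in enumerate(records):
            store.save(f"record-{i}", record)

    ingest([b"a", b"b"], InMemory())

Swapping InMemory for LocalDisk (or a cloud-storage client) upgrades one part of the system while leaving the rest untouched.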
It is important to identify the engineering practices that matter, pick the tried-and-true, and avoid the distractions. In both coding and technology selection, the fewer the variations, the cleaner the implementation.
A data-first approach, coupled with a few choice languages, affords a very clean and consistent methodology to analyze and build a system - of any complexity.
Thinking problems through, breaking them down to their core, and then building the solutions - with the right abstractions and tools - is the essence of engineering.
Product Engineering
Crucial to making a relevant product is the ability to engage with customers to clarify, communicate, and manage expectations. Requirements must be identified, gathered, and formalized. The product design and roadmap need to be defined, and - keeping the timeline and budget in view - the software foundation necessary to achieve them must be engineered. Meetings need to be steered toward meaningful conclusions.
Data, 3D Graphics and Simulation
Solving the data problem means identifying what to capture and how to transmit, collect, process (filter and map), validate, and store it (in a database or custom storage) - at scale - thereby enabling data-science and visualization teams to use the data.
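As a hedged illustration of those stages (a standalone Python sketch with made-up record shapes, not a production pipeline), each stage can be a small composable generator:

    import json
    from typing import Iterable, Iterator

    def collect(raw_lines: Iterable[str]) -> Iterator[dict]:
        # Collect: parse captured lines into records.
        for line in raw_lines:
            yield json.loads(line)

    def process(records: Iterable[dict]) -> Iterator[dict]:
        # Process: filter out empty events, then map to a compact shape.
        for r in records:
            if r.get("event"):
                yield {"event": r["event"], "ts": r.get("ts", 0)}

    def validate(records: Iterable[dict]) -> Iterator[dict]:
        # Validate: drop records with an impossible timestamp.
        for r in records:
            if r["ts"] >= 0:
                yield r

    def store(records: Iterable[dict], sink: list) -> None:
        # Store: stand-in for a database or custom storage writer.
        sink.extend(records)

    raw = ['{"event": "click", "ts": 42}', '{"event": ""}']
    out: list = []
    store(validate(process(collect(raw))), out)
    print(out)  # [{'event': 'click', 'ts': 42}]

Because each stage only consumes and yields records, stages can be reordered, swapped, or scaled out independently.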
3D graphics is inherently data-driven, so an understanding of data pipelines translates directly into a better visualization product. It must be combined with a fundamental understanding of hardware architecture and the nuances of its software interface.
Simulation ties together a physical process, its data, and its visualization to provide the most telling story about a physical system.
Books and Links
The following books, authors, and resources have had a profound influence in cultivating a software development style and understanding.
- Hackers and Painters by Paul Graham
- Uncle Bob’s coding talks
- The C++ Programming Language by Bjarne Stroustrup
- Programming Erlang by Joe Armstrong
- Design Patterns Explained by Alan Shalloway
- Linux 3D Graphics Programming by Norman Lin
Tech Stack
Preferred, tried, and most familiar technology choices include:
- C++, Python, Assembly, Dart, Clojure, Haskell, Erlang, Elixir | GCP, Azure, Kubernetes, Docker | DevOps, CI/CD, automation.
- Google Cloud: Kubernetes, Cloud Composer, Dataflow, Datastore, Pub/Sub, Postgres, Cloud Storage
- Parallel and async programming, OpenMP | OpenCASCADE, Direct3D, OpenGL, OpenSCAD
Showcase
- 3D Terrain rendering engine
- Financial accounting software
- Geo-fenced eventing platform
- Parallel programming library
- Async file handling for GCS
Experience
- LLM integration in EHR (FHIR) for an AI-enabled experience
- ESP32 firmware + app + MQTT cloud service
- Microsoft (multimedia, MPEG decoder)
- Amazon (metrics collection and aggregation)
- Tune (ad-click data ingestion service)
- Stellar Science (3D modeling software and file format)
- SimAuthor (flight simulation, real-time video + data capture and 3D graphics)
- iStreamPlanet (real-time video encoding platform)
- Strivr (VR data architecture)
- Load balancer implementation; establishment of Dockerized app development and templated CI/CD pipelines.
- Resolution of critical bugs affecting data consistency and correctness.
- Charting an itemized, pragmatic path to evolve the data platform.
- Design of the event data format, the processing technology (Dataflow), and the storage subsystem (data lake).
- Identification and automation of workflow processes via Apache Airflow.
- End-to-end data pipeline setup: stages, their roles, and interfaces.
- Parallelization of GCS file management: 20 million file-move operations from one machine in a single day (see the sketch after this list).
- Learning Record Store (LRS) integration and push service.
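A minimal sketch of the GCS parallelization mentioned above, assuming the google-cloud-storage Python client and illustrative bucket and prefix names (the production version also needs retries, batching, and rate-limit handling):

    from concurrent.futures import ThreadPoolExecutor
    from google.cloud import storage

    def move_prefix(bucket_name: str, src: str, dst: str, workers: int = 64) -> None:
        client = storage.Client()
        bucket = client.bucket(bucket_name)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for blob in client.list_blobs(bucket_name, prefix=src):
                new_name = dst + blob.name[len(src):]
                # rename_blob is a server-side copy followed by a delete;
                # the moves are I/O-bound, so threads parallelize well.
                pool.submit(bucket.rename_blob, blob, new_name)

    # Hypothetical usage; names are placeholders.
    move_prefix("example-bucket", "incoming/", "archive/")

Because each move is an independent network call, a single machine can keep many requests in flight; throughput then scales with the worker count until API quotas become the limit.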
For more information, contact: info@44systems.com