About
Intro
- Data-oriented, decoupled and scalable software design to build robust software products and systems.
- Key areas: Data pipelines, AI Integration, 3D Graphics, Simulation, Backend services
- Extensive experience bootstrapping v1 software - from design to product.
Robust Software
Correct, Performant, Simple, Elegant, Extensible.
In other words, quality software! It requires a mindset and a depth of understanding that go beyond the industry's current favorite buzzwords. It demands disciplined, consistent execution of a set of fundamental principles born of attention to detail and a desire to improve one's craft. Writing robust software is an iterative process.
Software Engineering
Software Engineering is the process of taking an idea or an algorithm and turning it into a robust product. It starts with the ability to break a complex problem down into its constituent elements and then arrange and assemble them back - almost Lego style - in a way that allows parts to be swapped or upgraded without affecting the overall system.
It is important to identify the engineering practices that matter, pick the tried-and-true, and avoid the distractions. In both coding and technology selection, the fewer the variations, the cleaner the implementation.
A data-first approach, coupled with a few carefully chosen languages, affords a clean and consistent methodology for analyzing and building a system of any complexity.
Thinking through problems, breaking them down to their core, and then building the solutions - with the right abstractions and tools - is the essence of engineering.
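A minimal Python sketch of this style - plain data plus a narrow interface so a part can be swapped without disturbing the rest. The names (Reading, Store, ingest) are hypothetical, for illustration only:

```python
from dataclasses import dataclass
from typing import Protocol

# Data first: the record is plain data with no behavior attached.
@dataclass(frozen=True)
class Reading:
    sensor_id: str
    value: float

# A narrow interface: anything that can save a Reading fits here.
class Store(Protocol):
    def save(self, reading: Reading) -> None: ...

class MemoryStore:
    def __init__(self) -> None:
        self.rows: list[Reading] = []

    def save(self, reading: Reading) -> None:
        self.rows.append(reading)

# ingest depends only on the interface, so the store can be swapped
# (memory, file, database) without touching this function.
def ingest(readings: list[Reading], store: Store) -> None:
    for r in readings:
        store.save(r)

store = MemoryStore()
ingest([Reading("t1", 21.5), Reading("t1", 21.7)], store)
print(store.rows)
```

Because ingest depends only on the Store interface, a file-backed or database-backed store can replace MemoryStore without any change to the pipeline code - the Lego-style swap described above.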
Product Engineering
Making a relevant product hinges on the ability to engage with customers to clarify, communicate, and manage expectations. Requirements must be identified, gathered, and formalized. The product design and road map need to be defined, and - keeping timelines and budget in view - the software foundation necessary to achieve them must be engineered. Meetings need to be steered toward meaningful conclusions.
Data, 3D Graphics, AI and Simulation
Data is still king, and it now has a smart vizier - AI. The key advance in software design is integrating AI smartly, so that data is used more effectively and seamlessly and its insights are unlocked. A data pipeline means identifying what to capture and how to capture, transmit, collect, process (filter and map), validate, and store it (data lake, database, or hybrid storage) - at scale. AI integration enables the other half: insights and visualizations. Together these let data-science and visualization teams use the data effectively, providing critical business metrics.
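As an illustration of the pipeline shape just described, here is a minimal stage-by-stage sketch; the events, field names, and checks are invented stand-ins, not a real system:

```python
# Capture -> process (filter and map) -> validate -> store,
# each stage a small function that can be tested and swapped alone.

def capture() -> list[dict]:
    # Stand-in for a real source (queue, log, API).
    return [{"user": "a", "clicks": 3}, {"user": "", "clicks": -1}]

def process(events: list[dict]) -> list[dict]:
    # Filter out malformed events, then map to the storage schema.
    kept = [e for e in events if e["clicks"] >= 0]
    return [{"user_id": e["user"], "clicks": e["clicks"]} for e in kept]

def validate(rows: list[dict]) -> list[dict]:
    # Reject rows that would corrupt downstream aggregates.
    return [r for r in rows if r["user_id"]]

def store(rows: list[dict]) -> None:
    # Stand-in for a data lake, database, or hybrid sink.
    print("stored:", rows)

store(validate(process(capture())))
```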
3D graphics are inherently data driven; hence an understanding of data pipelines translates directly into a better visualization product. This must be combined with a fundamental understanding of hardware architecture and the nuances of its software interface.
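One concrete habit this implies, sketched below assuming a NumPy-style workflow: keep each vertex attribute in its own contiguous array (structure-of-arrays), a layout that maps cleanly onto GPU vertex buffers and streams well through CPU caches. The translate helper is hypothetical:

```python
import numpy as np

# Structure-of-arrays: one contiguous array per vertex attribute,
# so a pass that only needs positions never touches normals.
positions = np.zeros((4, 3), dtype=np.float32)  # x, y, z per vertex
normals   = np.zeros((4, 3), dtype=np.float32)  # nx, ny, nz per vertex

# Illustrative transform: takes plain arrays, returns plain arrays.
def translate(positions: np.ndarray, offset) -> np.ndarray:
    return positions + np.asarray(offset, dtype=np.float32)

positions = translate(positions, (1.0, 0.0, 0.0))
print(positions.nbytes, "bytes, uploadable as a single vertex buffer")
```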
Simulation ties together a physical process, its data and visualization to provide the most telling story about a physical system.
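A toy sketch of that combination - a falling body as the physical process, the recorded trajectory as its data, and a print loop standing in for a real renderer:

```python
# Fixed-timestep loop: physics step, data capture, then visualization.
dt, g = 0.1, -9.81
height, velocity = 10.0, 0.0
trajectory = []  # the simulation's data product

while height > 0.0:
    velocity += g * dt            # physical process
    height += velocity * dt
    trajectory.append((height, velocity))

for h, v in trajectory:           # stand-in for a real renderer
    print(f"h={h:6.2f}  v={v:6.2f}")
```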
Books and Links
The following books, authors, and resources have had a profound influence in cultivating this software development style and understanding.
- Hackers and Painters by Paul Graham
- Uncle Bob’s coding talks
- The C++ Programming Language by Bjarne Stroustrup
- Programming Erlang by Joe Armstrong
- Design Patterns Explained by Alan Shalloway
- Linux 3D Graphics Programming by Norman Lin
Technologies
The best code is code that is not written. It is better to avoid complexity than to try to overcome it. The latest is not always the greatest! These maxims have shaped the choice of technology here; the preferred, tried, and more familiar technologies include:
- Languages: C++, Python, Dart, Clojure, Haskell, Erlang / Elixir.
- AI: PyTorch, TensorFlow, LangChain, ChromaDB, Ollama
- Data: Apache Beam, Spark, Kafka, Airflow, DuckDB, PostgreSQL, Cassandra, Druid, Avro, Parquet, Iceberg
- Web: Hugo, htmx, Alpine.js, Phoenix, ClojureScript.
- Cloud: Azure, AWS, GCP; Kubernetes, Docker, Terraform
- Google Cloud: Kubernetes, Cloud Composer, Dataflow, Datastore, Pub/Sub, Cloud Storage, Bigtable
- Libraries: Flutter, OpenMP, OpenCASCADE, Direct3D, OpenGL, Pandas, Polars
Showcase
- mzmlab, medsightai, audionavai
- 3D Terrain rendering engine
- Financial accounting software
- Geo-fenced eventing platform
- Parallel programming library
- Async file handling for GCS
Experience
- LLM integration in EHR (FHIR) for an AI-enabled experience
- Fishtank automation: end-to-end software stack; ESP32 firmware + mobile app + MQTT cloud service
- Ad-click data ingestion service (Tune)
- 3D in-house modeling software and file format (Stellar Science)
- VR data architecture (Strivr)
- Systematic upgrade of a data platform from a monolithic application to a service-oriented, event-based platform.
- Identification and alignment of customer needs with company goals.
- Continuous improvement in processes and engineering.
- Thoughtful data pipeline design ensuring data reliability, consistency, replayability, and performance.
- Economical data pipeline through careful choice of technologies at each step.
- Integrated automation at every step of the pipeline, from ingestion to report generation.
- Staged data modeling; upstream more generic, downstream more specialized; data lake as the single source of truth (see the sketch after this list).
- In-house visualizations as well as third-party integration through APIs and LMS systems.
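A sketch of the staged-modeling idea referenced above, with hypothetical field names: a generic upstream event, as it might sit in the data lake, is narrowed into a specialized downstream row:

```python
# Upstream: generic, schema-light, kept whole in the data lake.
raw_event = {
    "type": "purchase",
    "payload": {"sku": "X1", "amount_cents": 499},
    "ts": "2024-01-01T00:00:00Z",
}

# Downstream: specialized, validated, typed for one use case.
def to_purchase_row(event: dict) -> dict:
    assert event["type"] == "purchase"
    p = event["payload"]
    return {"sku": p["sku"],
            "amount_usd": p["amount_cents"] / 100,
            "event_time": event["ts"]}

print(to_purchase_row(raw_event))
```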
For more information, contact: info@44systems.com