Hi, I’m Max

I drive product success, offer strategic guidance, craft iOS apps, enhance user experiences, turn ideas into reality, produce engaging podcasts, design captivating visuals, implement IoT solutions, establish brand identities, create responsive websites, craft persuasive copy, uncover data insights, drive marketing initiatives, lead successful projects, design intuitive interfaces, create immersive experiences, leverage data-driven decisions, captivate audiences, foster growth opportunities, and drive revenue growth.

I am a software engineer currently maintaining two developer projects: BoltAThread and Handyapp. Handyapp is an AI-driven home repair platform based in the United Kingdom, while BoltAThread is a companion tool for the social media platform Threads, designed to enhance the platform’s user experience. On the side, I enjoy writing and talking about philosophy, mental health, technology, and all things life. My greatest passion is harnessing the power of technology to help people improve their health and wellness, enabling them to lead more fulfilling lives.

I graduated summa cum laude from the University of Cincinnati with a master’s degree in information technology and a bachelor’s degree in communication design and computer science. In school, my main academic interests were human-computer interaction, digital transformation, and iOS development. One of the highlights of my university experience was the opportunity to participate in the 1819 Innovation Accelerator, where I was the product manager for Yummr, an AR food menu platform, in the fall of 2022.

Throughout high school and college I worked on many side projects, which can be found below. The most prominent of these was Nebula, a physical tool that aids in turning real-world objects into digital twins, which I conceived in 2022 during grad school.

As for past employer experience, I have worked directly with the government, specifically the Intelligence Community, on UI upgrades to software maintained for nuclear submarines, and at a variety of startups ranging from healthcare to construction, all centered on innovative software projects. If you’d like to chat, my availability can be found on my calendar.

Featured Projects



Nebula

Inspired by my master’s capstone on designing a photogrammetry platform for business owners, Nebula is a hardware-focused lightbox designed for creating digital twins with ease. It features collapsible side walls, a built-in turntable, and ample storage for accessories. The lightbox has physical controls on its side, with further customization available remotely via the iPhone companion app.

  • The Lightbox Remote app lets owners control the brightness and turntable speed from within the app
  • Uses CoreBluetooth for connecting to and disconnecting from the lightbox
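A minimal sketch of how the remote app’s CoreBluetooth link might be wired up. The service and characteristic UUIDs and the one-byte brightness encoding are illustrative assumptions, not the shipped implementation:

```swift
import CoreBluetooth

// Hypothetical UUIDs for illustration only.
let lightboxService = CBUUID(string: "FFE0")
let brightnessCharacteristicID = CBUUID(string: "FFE1")

/// Encodes a 0–100% brightness value as a single-byte payload,
/// clamping out-of-range input.
func brightnessPayload(percent: Int) -> Data {
    let clamped = min(max(percent, 0), 100)
    return Data([UInt8(clamped)])
}

final class LightboxLink: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var lightbox: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    // Scan only once Bluetooth is powered on.
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [lightboxService])
        }
    }

    // Connect to the first lightbox we discover.
    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        lightbox = peripheral
        central.stopScan()
        central.connect(peripheral)
    }

    // Write a new brightness level to the (already discovered) characteristic.
    func setBrightness(_ percent: Int, on characteristic: CBCharacteristic) {
        lightbox?.writeValue(brightnessPayload(percent: percent),
                             for: characteristic, type: .withResponse)
    }
}
```

Service and characteristic discovery callbacks are elided; the same pattern extends to a turntable-speed characteristic.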

Pre-Flight Inspection

Pre-flight helicopter inspections can benefit from modern AR and IoT technologies. In this simulation, the pilot simply points an iPad at the helicopter, and the app walks through the steps for performing the inspection and capturing the data. Leveraging a physical CAD model of the Black Hawk helicopter, ILW developed an inspection checklist application for a pilot to use during pre-flight inspection. The tool captures data on the iPad via AR software and exchanges data bi-directionally with IoT software. The experience, built for the US Army, ran on Vuforia Studio; ThingWorx supplied simulated APU and fuel-tank readings so levels could be verified as within satisfactory range for the pilot, and the animations were created in Creo Illustrate.


Snakey

An iOS app that I built from the ground up to aid productivity for construction workers. It takes inspiration from the traditional iOS home-screen layout and features “mini-apps,” or modules, within the app itself. This demo raised $1M+ in funding. Snakey’s features include:

  • Schedule view to see your jobs for the day
  • Map view to see where the equipment is located
  • Query and find a designated equipment item
  • Locate module that integrates the Nearby Interaction framework to find the precise location of a given piece of equipment via UWB
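A minimal sketch of how a UWB locate module might use Apple’s Nearby Interaction framework. The session wiring is illustrative, and the exchange of discovery tokens between devices (normally done over the network) is elided:

```swift
import NearbyInteraction

/// Formats a UWB distance reading for display; nil means no reading yet.
func distanceLabel(meters: Float?) -> String {
    guard let m = meters else { return "Searching…" }
    return String(format: "%.1f m away", m)
}

final class EquipmentLocator: NSObject, NISessionDelegate {
    private let session = NISession()

    /// Starts ranging against a peer device (e.g. a U1-equipped tag on the equipment).
    /// The peer's discovery token must be obtained out of band.
    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly as distance (and, when available, direction) updates arrive.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        print(distanceLabel(meters: object.distance))
    }
}
```

Nearby Interaction requires UWB-capable hardware (iPhone 11 and later); on unsupported devices the app would fall back to the map view alone.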


Yummr is a combination of Yum + Mixed Reality. The mission is to bring joy to everyone’s day by creating an immersive and fun AR experience. By visualizing food in augmented reality, we are one step closer to enhancing our senses with technology, through the one thing that unites us all: FOOD.

Yummr is a mobile platform built on a photogrammetry pipeline that turns food dishes into 3D models, letting customers see before they eat.

As a person who loves food and is indecisive about what to eat, I tend to find myself pulling up Yelp to help decide which meal item to pick. Often the photos are taken from weird angles or in terrible lighting, some show half-eaten dishes, and they are just hard to find. I wanted to be able to easily see what a menu item looks like, one to one with the exact dish.

Months and months of research into this problem led me back to the domain I have expertise in: augmented reality. I decided to create an augmented reality app that allows you to view 3D food models.

I experimented with 3D modeling software and worked toward a working prototype that let one view a simple one-page menu of AR items. After receiving plenty of market validation from restaurants, I wanted to take it one step further and support multiple restaurants with AR items, so I created sample data and did just that. The challenge that came next was creating authentic 3D data. So I did what any individual would do and gave up on the idea as too hard. Just kidding: I went full gung-ho and threw myself into learning Python, Bash, Swift, and other languages to make creating a 3D model easy.

I ended up perfecting it: I created a methodology that lets users turn images and videos into 3D models with ease, all within my app. The end-to-end data process was the biggest challenge, and it ended up inspiring my master’s thesis on building 3D models to enhance the customer experience for businesses. The entire 3D capture methodology was revamped within the paper.

The idea for Yummr centered on people who love food and just want to know what they are getting before they decide. I expanded this concept further and leveraged the prototype I had built to recruit an amazing team of engineers to flesh it out.

From there, we ended up being invited to the 1819 Accelerator to pitch our business and were awarded $5,000 to invest back into it.

Some background context: I am an entrepreneur with 8+ years of experience taking projects from idea to completion. I decided to put a twist on this concept and take a project-based rather than business-based focus. This allowed us to have fun and hand the culture to the people of the company, letting everyone in Yummr choose their destiny in food.

Problems Restaurant Owners Face

Restaurant owners open restaurants because they love food and cooking. Yet running a successful restaurant is extremely difficult: margins are thin, and most close within the first two years. While owners are masters of their craft, they are not always IT professionals. That is where we come in: our goal is to help them with technology and take that concern off their… plate.

Scanner Feature

At our core is a scanner that lets restaurant owners create 3D models by photographing a dish from every angle with their smartphone. Our app will bring photogrammetry to the masses with an easy-to-use scanner.
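The capture-to-model step can be sketched with Apple’s RealityKit Object Capture API, which the app’s framework list mentions. The file paths and detail level here are illustrative, and the shipped pipeline also involves cloud-side processing:

```swift
import Foundation
import RealityKit

/// Formats a reconstruction progress fraction for display.
func progressText(_ fraction: Double) -> String {
    "Reconstructing: \(Int(fraction * 100))%"
}

/// Turns a folder of dish photos into a .usdz model.
/// Object Capture requires macOS 12+ (or iOS 17+) and capable hardware.
func reconstructDish(photos: URL, output: URL) throws {
    let session = try PhotogrammetrySession(input: photos)
    try session.process(requests: [
        .modelFile(url: output, detail: .reduced)
    ])

    Task {
        // Stream progress updates and the finished model URL.
        for try await result in session.outputs {
            switch result {
            case .requestProgress(_, let fraction):
                print(progressText(fraction))
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url.path)")
            case .requestError(_, let error):
                print("Capture failed: \(error)")
            default:
                break
            }
        }
    }
}
```

A higher `detail` setting (e.g. `.full`) trades reconstruction time for the hyper-realistic quality the model database calls for.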


Photogrammetry is the process of stitching photos together into a 3D model that can be viewed and interacted with in multiple ways. The file type iOS uses is .usdz, while websites use .glb.

Viewing in AR/VR

The main way is to use the smartphone’s camera to place the object on a surface in front of you (augmented reality). Another is viewing the model on the phone’s screen: unlike a still photo, the user can spin or zoom the object. Lastly, our food assets can be loaded into a virtual world through a headset such as the Oculus Rift.
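On iOS, both of the first two viewing modes come nearly for free from Apple’s AR Quick Look viewer, which any app can present for a .usdz file. A minimal sketch (the view-controller wiring is illustrative):

```swift
import QuickLook
import UIKit

/// AR Quick Look only accepts .usdz (or .reality) files.
func isUSDZ(_ url: URL) -> Bool {
    url.pathExtension.lowercased() == "usdz"
}

final class ModelPreviewer: NSObject, QLPreviewControllerDataSource {
    private let modelURL: URL  // e.g. a downloaded dish model

    init(modelURL: URL) {
        self.modelURL = modelURL
    }

    /// Presents the model in the system AR Quick Look viewer, which offers
    /// both the on-screen "Object" mode and the "AR" surface-placement mode.
    func present(from viewController: UIViewController) {
        guard isUSDZ(modelURL) else { return }
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        modelURL as NSURL
    }
}
```

A custom ARKit/SceneKit scene gives more control (lighting, table-wide shared sessions), but Quick Look is the fastest path from a .usdz file to a spin-and-zoom or AR view.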

Database of Hyper-Realistic Food Models

We are building a large database of hyper-realistic food models. The quality of the models is key because we are competing with HD photos and the fake food used in commercials. Going from input photos to lifelike models is a complicated, time-consuming process: it requires preprocessing and editing the input photos behind the different components of the 3D model. We therefore plan to automate the pipeline from start to finish and use machine learning for the image editing. The last step is touching up the final model in post-processing.

Note: A 3D model is composed of a mesh map, which gives it shape, and a texture map that overlays onto the mesh.

Restaurant and Menu Directory
  1. Best and most up-to-date restaurant and menu information – the go-to app for restaurants to manage their menu.  Make it easy by only doing it in one place.
    1. Make real time changes to the menu
      1. Mark unavailable items – if it is out of stock due to ingredient availability.  Supply chain issues are tough lately.
      2. Or add/modify daily specials
    2. Edit menu and information in one place and be changed across all platforms
      1. Their own website or app
      2. Google My Business (Google Maps, Apple Maps)
      3. Directory sites (Yelp, TripAdvisor)
      4. Delivery apps (Uber Eats, Postmates, DoorDash)
      5. POS systems (Toast, Square, Clover, Aloha)
  2. Search and recommendation system of restaurants near you

Since we have a large database of information and models, other companies can use our API to load the assets into their apps or even into a VR game they are making. Imagine our assets on popular apps like Uber Eats, DoorDash, or even Instagram!

Other features:
  1. QR Codes – customers can scan QR codes to load the restaurant menu. QR code adoption has increased a ton due to COVID.
  2. Team AR feature – while in AR mode, you can move your camera around to see what others at your table are viewing while your own model sits right in front of you. Imagine seeing all the food for the whole team before you feast!
  3. Online ordering – order food for in-store pickup or dine-in through our app. We do not plan on implementing this soon, but it is an option; we want to focus on our core rather than this market, since there are many large competitors.
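A menu QR code like the one described above can be generated on-device with Core Image’s built-in generator. The deep-link URL scheme here is an illustrative assumption:

```swift
import CoreImage

/// Hypothetical deep link that opens a restaurant's menu in the app.
func menuLink(restaurantID: String) -> String {
    "https://yummr.example/menu/\(restaurantID)"
}

/// Renders a string as a QR code using Core Image's CIQRCodeGenerator filter.
func qrCode(for string: String) -> CIImage? {
    guard let filter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    filter.setValue(Data(string.utf8), forKey: "inputMessage")
    filter.setValue("M", forKey: "inputCorrectionLevel")  // medium error correction
    // Scale up: the raw output is tiny (one point per QR module).
    return filter.outputImage?.transformed(by: CGAffineTransform(scaleX: 10, y: 10))
}
```

The resulting `CIImage` can be rendered into a `UIImage` for printing on table tents or receipts.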
One important distinction is that the owner has access to more features than a regular customer.

Tab view controllers:
  1. HomeViewController:
    1. Search bar
    2. Tableview of nearby restaurants or search results
  2. MyMenuViewController (for owner only)
  3. Dashboard tab (tableview of some options):
    1. My Restaurant (edit information about the restaurant)
    2. Messages (for owner only): after the owner submits photos, we process them in the cloud. Once a model is created and touched up, our team admins review it and either approve or deny it. The messages controller gives the owner updates on whether their models have been approved or denied.
    3. Account settings
RestaurantViewController: A tableview that lists all the menu items. A collection view in the first section of the tableview lists the menu’s headers, such as appetizers, salads, or burgers; the owner can customize these. Each row has a thumbnail photo, name, description, and price. Tapping the photo or the View in AR button opens the item detail screen.

ItemDetailViewController: A UIView containing editable rows; the same XIB is reused by the restaurant owner view for adding new food items.

AddFoodItemViewController: Owners have a button on My Menu that lets them create a new food item and opens the camera scanner.

Managers: The Manager folder holds managers for different concerns, such as interfacing with Firebase.

Backend and authentication: We use Firebase to handle authentication and Firebase Firestore for our NoSQL database. It has a few collections:
  1. Restaurant – information such as name, description, address, website, hours, and settings like genre.
  2. MenuItems – information about each menu item, such as name, price, and the file URL of the model.
  3. Users – information about the users, such as name, whether they are an owner, and email.
  4. Owners – a collection of all the owner accounts. Every owner is also in the Users collection, but since Firebase Authentication does not separate account types, we have to do it manually.
  5. Favorites – arrays of users’ favorite restaurants and menu items.
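A minimal sketch of how the app might read the MenuItems collection with the Firebase iOS SDK. The field names follow the description above, but the exact keys and the restaurant-ID filter are illustrative assumptions:

```swift
import FirebaseFirestore

/// Mirrors the fields described for the MenuItems collection (assumed keys).
struct MenuItem {
    let name: String
    let price: Double
    let modelURL: String

    init?(data: [String: Any]) {
        guard let name = data["name"] as? String,
              let price = data["price"] as? Double,
              let modelURL = data["modelURL"] as? String else { return nil }
        self.name = name
        self.price = price
        self.modelURL = modelURL
    }
}

/// Fetches one restaurant's menu items from Firestore.
func loadMenu(restaurantID: String,
              completion: @escaping ([MenuItem]) -> Void) {
    Firestore.firestore()
        .collection("MenuItems")
        .whereField("restaurantID", isEqualTo: restaurantID)  // assumed field
        .getDocuments { snapshot, error in
            guard let docs = snapshot?.documents, error == nil else {
                completion([])
                return
            }
            // Skip any documents that are missing required fields.
            completion(docs.compactMap { MenuItem(data: $0.data()) })
        }
}
```

The `modelURL` field would point at the finished .usdz in Firebase Cloud Storage, which the AR views can then download and display.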
Firebase Cloud Storage: Images and models are stored here, and we reference their URLs in Firestore. We use AWS S3 specifically to upload the scanned photos, then process the models and put the final product into Firebase Cloud Storage. S3 let us upload multiple files at a time.
Frameworks: UIKit, SwiftUI, ARKit, SceneKit, AVKit, FocusEntity, Firebase, AWS S3, Object Capture. Over 85,000 lines of code went into this behemoth.

Yummr was graciously invited to participate in the 1819 Accelerator cohort. This turned the project into a business and propelled us further: we solidified our problem, solution, market validation, and market size, as well as what comes next. Watch the video below to see the inner workings.