Lobe is a free, easy-to-use tool that helps you start working with machine learning.
This project was created to help you bootstrap your Lobe project on iOS. It's built with SwiftUI for Apple's iOS and iPadOS platforms.
In the next few sections we’ll take you through the basics of creating your new project and getting started. At a high level, we’ll go over:
- Installing your Development Environment
- Exporting your model from Lobe and integrating it into the code
- Deploying your app on your device
- Tips and Tricks for creating your own custom version of this app
- Contributing
In this stage we're going to get you set up so you can build, launch, and play with your app. These instructions are written for macOS, the only platform on which you can develop iOS apps.
To start, we're going to download ("clone") this repository. If you already have `git` installed and know how to clone this repo, skip to Step 2.
If you prefer to use the GitHub Desktop app, click the "Code" button above and choose "Open with GitHub Desktop".
Otherwise, we need to install a few things:
First, open a Terminal window.
Next, copy and paste the following commands into the Terminal window, hitting return after each:
```bash
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
brew doctor
brew install git
```
Now that `git` is installed, you can clone this repo with the following command. You'll want to navigate to a folder in Terminal where you'd like to store these files. If you need help, here's a gentle introduction to navigation in the terminal.
```bash
git clone https://github.com/lobe/iOS-bootstrap.git
```
Step 2 - Installing Xcode
Next, we're going to install Xcode, a free tool from Apple, via the App Store. This is a fairly straightforward process that could take an hour or more, as the Xcode app is pretty large.
Once it's done, double-click the `Lobe_iOS.xcodeproj` file in your project directory and it'll open in Xcode!
Now we need to export your custom model from Lobe. If you just want to see the app working with the default sample model, you can skip ahead to the deploying your app section.
Once you've trained a custom model in Lobe, you can drop it into your app.
First, let's open your project in Lobe and export it by pressing ⌘E and selecting CoreML.
Once you have the CoreML model, rename it to `LobeModel.mlmodel` and drag it into the root of this repo to replace the existing sample model.
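The file name matters because Xcode generates a Swift class named after the `.mlmodel` file (here, `LobeModel`), and the app's prediction code refers to that class. As a rough illustration only (a generic Core ML + Vision sketch, not necessarily the exact code in this project), loading the model and building a classification request looks something like this:

```swift
import CoreML
import Vision

// Sketch only: `LobeModel` is the class Xcode auto-generates from LobeModel.mlmodel.
func makeClassificationRequest() throws -> VNCoreMLRequest {
    let mlModel = try LobeModel(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: mlModel)
    return VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Prediction: \(top.identifier) (confidence \(top.confidence))")
    }
}
```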
And we're done! Next let's get it on your phone so you can see it work live.
Next, we'll want to get this app onto your phone so you can see it working live with your device's camera. To do this, plug in your device via a USB-Lightning cable and, in the open Xcode window, press the play button in the top left corner of the window.
And there you have it! Your app should be running on your device. If Xcode pops up a message asking you to set up your team, just follow the steps it suggests or take a look here. And finally, if you'd like to post your app (running your custom image classification model) to the App Store, you're more than welcome to do so. Follow the instructions here to get the process rolling. You'll need an Apple Developer account.
This app is meant as a starting place for your own project. Below is a high-level overview of the project to get you started. Like any good bootstrap app, this project has been kept intentionally simple. There are only two main components in two files, `ContentView.swift` and `MyViewController.swift`.
`ContentView.swift` contains all the main UI, built using SwiftUI. If you'd like to adjust the placement of any UI elements or add your own, start here. If you'd like a primer on SwiftUI, start with this: Build a SwiftUI app for iOS 14
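To give a sense of the overall structure (the names below are hypothetical and simplified, not the actual contents of `ContentView.swift`), the layout is roughly a full-screen camera view with the prediction bar layered on top of it:

```swift
import SwiftUI

// Rough, simplified sketch of this kind of layout; not the project's actual code.
struct ContentViewSketch: View {
    @State private var prediction = "Waiting for a prediction…"

    var body: some View {
        ZStack(alignment: .bottom) {
            Color.black                      // stand-in for the camera view
                .edgesIgnoringSafeArea(.all)
            Text(prediction)                 // stand-in for the prediction bar
                .padding()
                .background(Color.white.opacity(0.8))
                .cornerRadius(12)
                .padding(.bottom, 32)
        }
    }
}
```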
`MyViewController.swift` contains the parts that need to be done in the older UIKit style, mainly creating the camera view. Luckily, this is all bridged back to SwiftUI using Apple's `UIViewControllerRepresentable` API, which lets us make the camera view and then use it like any other SwiftUI view above. You'll also see the CoreML prediction call here.
There's also a small amount of SwiftUI for the prediction bar at the bottom of the screen.
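For reference, the bridging pattern mentioned above looks roughly like this (a minimal sketch, not the project's exact code):

```swift
import SwiftUI
import UIKit

// Minimal sketch: UIViewControllerRepresentable wraps a UIKit view controller
// so SwiftUI can place it in a view hierarchy like any other view.
struct CameraViewSketch: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UIViewController {
        // In the real app, this would be the view controller that owns the
        // capture session and runs the CoreML prediction on camera frames.
        let controller = UIViewController()
        controller.view.backgroundColor = .black
        return controller
    }

    func updateUIViewController(_ uiViewController: UIViewController, context: Context) {
        // Respond to SwiftUI state changes here; nothing to update in this sketch.
    }
}
```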
- This project contains a sample icon and other assets; feel free to use these or create your own.
- When you're using the app, swiping up on the screen pulls open the image picker.
- Double tapping flips to the front-facing camera; double tapping again flips back to the rear camera. (A sketch of how such a gesture can be wired up follows this list.)
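If you'd like to customize these gestures, a double-tap handler in SwiftUI can be wired up roughly like this (a hypothetical sketch, not the project's exact code):

```swift
import SwiftUI

// Hypothetical sketch of a double-tap camera flip; the real project may differ.
struct CameraFlipSketch: View {
    @State private var usingFrontCamera = false

    var body: some View {
        Color.black // stand-in for the live camera preview
            .onTapGesture(count: 2) {
                // Toggle between the front and rear cameras on a double tap.
                usingFrontCamera.toggle()
            }
    }
}
```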
If you can think of anything you'd like to add, or you find bugs, please reach out! PRs will be openly accepted (if they keep the project simple, with bonus points for making it even simpler) and issues will be triaged.
We look forward to seeing the awesome projects you put out there into the world! Cheers!
– The Lobe Team