This repo contains `DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage` (745 MB).
Normalized Inverse Depth means that the model outputs values in $[0,1]$, where 1 is the pixel closest to the camera and 0 is the pixel furthest from it.
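The card does not spell out the exact normalization. One reading consistent with the description above (1 = nearest, 0 = farthest) is per-image min-max scaling of the inverse depth $1/d$:

$$
\text{normalizedInverseDepth}(p) \;=\; \frac{1/d(p) \;-\; \min_{q} 1/d(q)}{\max_{q} 1/d(q) \;-\; \min_{q} 1/d(q)}
$$

where $d(p)$ is the depth at pixel $p$. Under this reading the scaling is per image, so absolute (metric) scale is not recoverable from this output alone.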
This model was first pruned to 10% sparsity, then the weights were linearly quantized.
## DepthPro CoreML Models

DepthPro is a monocular depth estimation model. This means that it is trained to predict depth from a single image.
## Model Inputs and Outputs

### Inputs

- `image`: $1536 \times 1536$ 3-channel color image ($[1 \times 3 \times 1536 \times 1536]$ ImageType).

### Outputs

- `normalizedInverseDepth`: $1536 \times 1536$ monochrome image ($[1 \times 1 \times 1536 \times 1536]$ ImageType).
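As a quick sanity check on a Mac, the package can be exercised directly from Python with `coremltools`. This is a minimal sketch, not the reference usage: `example.jpg` stands in for any test image, and the package path assumes the `models/` directory used in the Download section below. On-device usage is covered in the Swift Integration section.

```python
import coremltools as ct
import numpy as np
from PIL import Image

# Load the Core ML package (macOS; requires coremltools).
model = ct.models.MLModel(
    "models/DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage"
)

# The model expects a 1536x1536 3-channel image.
image = Image.open("example.jpg").convert("RGB").resize((1536, 1536))

# Run prediction; image outputs come back as PIL images.
outputs = model.predict({"image": image})
inverse_depth = np.array(outputs["normalizedInverseDepth"], dtype=np.float32)

print(inverse_depth.shape, inverse_depth.min(), inverse_depth.max())  # values in [0, 1]
```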
## Download

Install `huggingface-cli`:

```bash
brew install huggingface-cli
```
To download:

```bash
huggingface-cli download \
  --local-dir models --local-dir-use-symlinks False \
  coreml-projects/DepthPro-coreml-normalized-inverse-depth-pruned-10-quantized-linear \
  --include "DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage/*"
```
To download everything, skip the `--include` argument.
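Alternatively, the same files can be fetched from Python with the `huggingface_hub` library; this is a minimal equivalent of the CLI call above:

```python
from huggingface_hub import snapshot_download

# Download only the pruned + quantized package into ./models.
snapshot_download(
    repo_id="coreml-projects/DepthPro-coreml-normalized-inverse-depth-pruned-10-quantized-linear",
    allow_patterns="DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage/*",
    local_dir="models",
)
```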
## Conversion Tutorial

The [`huggingface/coreml-examples`](https://github.com/huggingface/coreml-examples) repository contains sample conversion code for `DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage` and other models.
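The exact compression settings used for this package are not documented here, but 10% weight pruning followed by linear weight quantization (as described at the top of this card) is typically applied with `coremltools.optimize.coreml` roughly as in the sketch below. The uncompressed source package path is a placeholder; the full conversion pipeline that produces it lives in the repository above.

```python
import coremltools as ct
import coremltools.optimize.coreml as cto

# Placeholder path to an uncompressed DepthPro mlpackage.
mlmodel = ct.models.MLModel("DepthProNormalizedInverseDepth.mlpackage")

# 1) Prune weights to 10% sparsity.
prune_config = cto.OptimizationConfig(
    global_config=cto.OpMagnitudePrunerConfig(target_sparsity=0.1)
)
pruned = cto.prune_weights(mlmodel, config=prune_config)

# 2) Linearly quantize the remaining weights (int8 by default).
quant_config = cto.OptimizationConfig(
    global_config=cto.OpLinearQuantizerConfig(mode="linear_symmetric")
)
compressed = cto.linear_quantize_weights(pruned, config=quant_config)

compressed.save("DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage")
```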
## Swift Integration

The [`huggingface/coreml-examples`](https://github.com/huggingface/coreml-examples) repository contains sample Swift code for `DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage` and other models. See the instructions there to build the demo app, which shows how to use the model in your own Swift apps.