ultralytics 8.0.97
confusion matrix, windows, docs updates (#2511)
Co-authored-by: Yonghye Kwon <developer.0hye@gmail.com>
Co-authored-by: Dowon <ks2515@naver.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Laughing <61612323+Laughing-q@users.noreply.github.com>
@@ -1,5 +1,6 @@
---
comments: true
description: Run YOLO models on your Android device for real-time object detection with Ultralytics Android App. Utilizes TensorFlow Lite and hardware delegates.
---

# Ultralytics Android App: Real-time Object Detection with YOLO Models

@@ -19,7 +20,7 @@ FP16 (or half-precision) quantization converts the model's 32-bit floating-point

INT8 (or 8-bit integer) quantization further reduces the model's size and computation requirements by converting its 32-bit floating-point numbers to 8-bit integers. This quantization method can result in a significant speedup, but it may lead to a slight reduction in mean average precision (mAP) due to the lower numerical precision.

!!! tip "mAP Reduction in INT8 Models"

    The reduced numerical precision in INT8 models can lead to some loss of information during the quantization process, which may result in a slight decrease in mAP. However, this trade-off is often acceptable considering the substantial performance gains offered by INT8 quantization.

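To try these precision modes yourself, a quantized TFLite model can be produced with the Ultralytics export API. The snippet below is a minimal sketch, assuming the `ultralytics` package is installed and that the `half` and `int8` flags are accepted for the TFLite format in your installed version:

```python
from ultralytics import YOLO

# Load a pretrained detection model (any YOLOv8 weights file works here)
model = YOLO("yolov8n.pt")

# FP16 (half-precision) TFLite export: smaller and faster, minimal mAP impact
model.export(format="tflite", half=True)

# INT8 TFLite export: smallest and fastest, may trade away a little mAP
model.export(format="tflite", int8=True)
```

The resulting `.tflite` files can then be benchmarked on-device to confirm the accuracy trade-off is acceptable for your use case.
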
## Delegates and Performance Variability

@@ -61,4 +62,4 @@ To get started with the Ultralytics Android App, follow these steps:

6. Explore the app's settings to adjust the detection threshold, enable or disable specific object classes, and more.

With the Ultralytics Android App, you now have the power of real-time object detection using YOLO models right at your fingertips. Enjoy exploring the app's features and optimizing its settings to suit your specific use cases.

@@ -1,5 +1,6 @@
---
comments: true
description: Experience the power of YOLOv5 and YOLOv8 models with Ultralytics HUB app. Download from Google Play and App Store now.
---

# Ultralytics HUB App

@@ -1,5 +1,6 @@
---
comments: true
description: Get started with the Ultralytics iOS app and run YOLO models in real-time for object detection on your iPhone or iPad with the Apple Neural Engine.
---

# Ultralytics iOS App: Real-time Object Detection with YOLO Models

@@ -33,7 +34,6 @@ By combining quantized YOLO models with the Apple Neural Engine, the Ultralytics
| 2021 | [iPhone 13](https://en.wikipedia.org/wiki/IPhone_13) | [A15 Bionic](https://en.wikipedia.org/wiki/Apple_A15) | 5 nm | 15.8 |
| 2022 | [iPhone 14](https://en.wikipedia.org/wiki/IPhone_14) | [A16 Bionic](https://en.wikipedia.org/wiki/Apple_A16) | 4 nm | 17.0 |

Please note that this list only includes iPhone models from 2017 onwards, and the ANE TOPS values are approximate.

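For on-device use, YOLO weights are typically converted to Core ML so the Apple Neural Engine can run them. Below is a minimal export sketch with the Ultralytics Python API, assuming the `half` flag is supported for the Core ML format in your installed version:

```python
from ultralytics import YOLO

# Export a detection model to Core ML with FP16 weights for the Neural Engine
model = YOLO("yolov8n.pt")
model.export(format="coreml", half=True)  # produces a Core ML model file (.mlmodel/.mlpackage)
```
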
## Getting Started with the Ultralytics iOS App

@@ -52,4 +52,4 @@ To get started with the Ultralytics iOS App, follow these steps:

6. Explore the app's settings to adjust the detection threshold, enable or disable specific object classes, and more.

With the Ultralytics iOS App, you can now leverage the power of YOLO models for real-time object detection on your iPhone or iPad, powered by the Apple Neural Engine and optimized with FP16 or INT8 quantization.

@@ -1,5 +1,6 @@
---
comments: true
description: Upload custom datasets to Ultralytics HUB for YOLOv5 and YOLOv8 models. Follow YAML structure, zip and upload. Scan & train new models.
---

# HUB Datasets

@@ -46,4 +47,4 @@ names:
After zipping your dataset, sign in to [Ultralytics HUB](https://bit.ly/ultralytics_hub) and click the Datasets tab.
Click 'Upload Dataset' to upload, scan and visualize your new dataset before training new YOLOv5 or YOLOv8 models on it!

<img width="100%" alt="HUB Dataset Upload" src="https://user-images.githubusercontent.com/26833433/216763338-9a8812c8-a4e5-4362-8102-40dad7818396.png">

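If you prefer to create the zip from a script instead of a file manager, the sketch below shows one way to do it; the folder name `my_dataset` and the assumption that the dataset YAML, `images/` and `labels/` directories all live inside it are illustrative only:

```python
import shutil

# Create my_dataset.zip containing the my_dataset/ folder
# (assumed to hold the dataset YAML plus images/ and labels/),
# ready for the 'Upload Dataset' button in Ultralytics HUB.
shutil.make_archive("my_dataset", "zip", root_dir=".", base_dir="my_dataset")
```
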
@@ -1,5 +1,6 @@
---
comments: true
description: 'Ultralytics HUB: Train & deploy YOLO models from one spot! Use drag-and-drop interface with templates & pre-trained models. Check quickstart, datasets, and more.'
---

# Ultralytics HUB

@@ -20,7 +21,6 @@ comments: true
launch [Ultralytics HUB](https://bit.ly/ultralytics_hub), a new web tool for training and deploying all your YOLOv5 and YOLOv8 🚀
models from one spot!

## Introduction

HUB is designed to be user-friendly and intuitive, with a drag-and-drop interface that allows users to
@@ -6,7 +6,6 @@ comments: true

This page is currently under construction! 👷 Please check back later for updates. 😃🔜

# YOLO Inference API

The YOLO Inference API allows you to access the YOLOv8 object detection capabilities via a RESTful API. This enables you to run object detection on images without the need to install and set up the YOLOv8 environment locally.

@@ -45,7 +44,6 @@ print(response.json())

In this example, replace `API_KEY` with your actual API key, `MODEL_ID` with the desired model ID, and `path/to/image.jpg` with the path to the image you want to analyze.
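For reference, such a request might look roughly like the sketch below using the `requests` library; the endpoint URL, header name, and form field here are illustrative assumptions, so use the values provided in your Ultralytics HUB account rather than these placeholders:

```python
import requests

# Hypothetical endpoint and header name for illustration only --
# substitute the URL, API key and model ID provided by Ultralytics HUB.
url = "https://api.example.com/v1/predict/MODEL_ID"
headers = {"x-api-key": "API_KEY"}

with open("path/to/image.jpg", "rb") as f:
    response = requests.post(url, headers=headers, files={"image": f})

print(response.json())  # JSON detections: boxes, classes, confidences
```
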
## Example Usage with CLI

You can use the YOLO Inference API with the command-line interface (CLI) by utilizing the `curl` command. Replace `API_KEY` with your actual API key, `MODEL_ID` with the desired model ID, and `image.jpg` with the path to the image you want to analyze:

@@ -334,7 +332,6 @@ YOLO segmentation models, such as `yolov8n-seg.pt`, can return JSON responses fr
}
```

### Pose Model Format

YOLO pose models, such as `yolov8n-pose.pt`, can return JSON responses from local inference, CLI API inference, and Python API inference. All of these methods produce the same JSON response format.

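As a quick way to inspect that output locally, the sketch below runs a pose model with the Ultralytics Python API and prints the detected keypoints; the image path is a placeholder, and the `tojson()` helper is assumed to be available in your installed `ultralytics` version:

```python
from ultralytics import YOLO

# Run local pose inference on a sample image (path is a placeholder)
model = YOLO("yolov8n-pose.pt")
results = model("path/to/image.jpg")

# Per-person keypoint coordinates as an (N, K, 2) tensor of x, y pixel values
print(results[0].keypoints.xy)

# JSON view of the same result, if your ultralytics version provides tojson()
print(results[0].tojson())
```
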
@@ -1,5 +1,6 @@
---
comments: true
description: Train and Deploy your Model to 13 different formats, including TensorFlow, ONNX, OpenVINO, CoreML, Paddle or directly on Mobile.
---

# HUB Models

@@ -11,7 +12,6 @@ Connect to the Ultralytics HUB notebook and use your model API key to begin trai
<a href="https://colab.research.google.com/github/ultralytics/hub/blob/master/hub.ipynb" target="_blank">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>

## Deploy to Real World

Export your model to 13 different formats, including TensorFlow, ONNX, OpenVINO, CoreML, Paddle and many others. Run