ultralytics 8.0.136
refactor and simplify package (#3748)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
@@ -42,7 +42,7 @@ To perform object detection on an image, use the `predict` method as shown below
 ```python
 from ultralytics import FastSAM
-from ultralytics.yolo.fastsam import FastSAMPrompt
+from ultralytics.models.fastsam import FastSAMPrompt

 # Define image path and inference device
 IMAGE_PATH = 'ultralytics/assets/bus.jpg'
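For reference, a minimal sketch of FastSAM inference using the renamed import path. The checkpoint name `FastSAM-s.pt`, the inference arguments, and the `FastSAMPrompt` calls are illustrative assumptions based on the FastSAM docs of this release, not part of the diff.

```python
from ultralytics import FastSAM
from ultralytics.models.fastsam import FastSAMPrompt  # new location after the refactor

# Define image path and inference device
IMAGE_PATH = 'ultralytics/assets/bus.jpg'
DEVICE = 'cpu'

# Run FastSAM inference (FastSAM-s.pt is an assumed checkpoint name)
model = FastSAM('FastSAM-s.pt')
everything_results = model(IMAGE_PATH, device=DEVICE, retina_masks=True, imgsz=1024, conf=0.4, iou=0.9)

# Post-process with prompts: here, segment everything and save the annotated image
prompt_process = FastSAMPrompt(IMAGE_PATH, everything_results, device=DEVICE)
annotations = prompt_process.everything_prompt()
prompt_process.plot(annotations=annotations, output='./output/')
```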
@@ -6,9 +6,9 @@ keywords: MobileSAM, Faster Segment Anything, Segment Anything, Segment Anything

-# Faster Segment Anything (MobileSAM)
+# Mobile Segment Anything (MobileSAM)

-The MobileSAM paper is now available on [ResearchGate](https://www.researchgate.net/publication/371851844_Faster_Segment_Anything_Towards_Lightweight_SAM_for_Mobile_Applications) and [arXiv](https://arxiv.org/pdf/2306.14289.pdf). The most recent version will initially appear on ResearchGate due to the delayed content update on arXiv.
+The MobileSAM paper is now available on [arXiv](https://arxiv.org/pdf/2306.14289.pdf).

 A demonstration of MobileSAM running on a CPU can be accessed at this [demo link](https://huggingface.co/spaces/dhkim2810/MobileSAM). The performance on a Mac i5 CPU takes approximately 3 seconds. On the Hugging Face demo, the interface and lower-performance CPUs contribute to a slower response, but it continues to function effectively.
@@ -88,7 +88,7 @@ The Segment Anything Model can be employed for a multitude of downstream tasks t
 === "Prompt inference"

 ```python
-from ultralytics.vit.sam import Predictor as SAMPredictor
+from ultralytics.models.sam import Predictor as SAMPredictor

 # Create SAMPredictor
 overrides = dict(conf=0.25, task='segment', mode='predict', imgsz=1024, model="mobile_sam.pt")
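A short sketch of how the renamed `SAMPredictor` is driven with prompts, assuming the prompt-inference API documented for this release; the image path and box/point coordinates are placeholder values.

```python
from ultralytics.models.sam import Predictor as SAMPredictor  # new location after the refactor

# Create SAMPredictor with MobileSAM weights
overrides = dict(conf=0.25, task='segment', mode='predict', imgsz=1024, model="mobile_sam.pt")
predictor = SAMPredictor(overrides=overrides)

# Set the source image once, then prompt it repeatedly
predictor.set_image("ultralytics/assets/zidane.jpg")
results = predictor(bboxes=[439, 437, 524, 709])    # box prompt: [x1, y1, x2, y2]
results = predictor(points=[900, 370], labels=[1])  # point prompt: 1 = foreground

# Reset the cached image when done
predictor.reset_image()
```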
@@ -108,7 +108,7 @@ The Segment Anything Model can be employed for a multitude of downstream tasks t
 === "Segment everything"

 ```python
-from ultralytics.vit.sam import Predictor as SAMPredictor
+from ultralytics.models.sam import Predictor as SAMPredictor

 # Create SAMPredictor
 overrides = dict(conf=0.25, task='segment', mode='predict', imgsz=1024, model="mobile_sam.pt")
@@ -119,7 +119,7 @@ The Segment Anything Model can be employed for a multitude of downstream tasks t

 ```

-- More additional args for `Segment everything` see [`Predictor/generate` Reference](../reference/vit/sam/predict.md).
+- More additional args for `Segment everything` see [`Predictor/generate` Reference](../reference/models/sam/predict.md).

 ## Available Models and Supported Tasks
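As a companion to the `Segment everything` tab and the `Predictor/generate` reference above, a brief sketch of whole-image segmentation with extra generate-style arguments; the `crop_n_layers` and `points_stride` values are illustrative, not part of the diff.

```python
from ultralytics.models.sam import Predictor as SAMPredictor  # new location after the refactor

# Create SAMPredictor with MobileSAM weights
overrides = dict(conf=0.25, task='segment', mode='predict', imgsz=1024, model="mobile_sam.pt")
predictor = SAMPredictor(overrides=overrides)

# Segment the whole image; extra args are forwarded to the generate step
results = predictor(source="ultralytics/assets/zidane.jpg", crop_n_layers=1, points_stride=64)
```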
@@ -184,7 +184,7 @@ Auto-annotation is a key feature of SAM, allowing users to generate a [segmentat
 To auto-annotate your dataset with the Ultralytics framework, use the `auto_annotate` function as shown below:

 ```python
-from ultralytics.yolo.data.annotator import auto_annotate
+from ultralytics.data.annotator import auto_annotate

 auto_annotate(data="path/to/images", det_model="yolov8x.pt", sam_model='sam_b.pt')
 ```
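For completeness, a commented sketch of the same `auto_annotate` call with optional arguments spelled out; `device` and `output_dir` are assumptions about the function's signature in this release rather than part of the diff.

```python
from ultralytics.data.annotator import auto_annotate  # new location after the refactor

# Propose boxes with a YOLOv8 detector, then convert them to masks with SAM,
# writing YOLO-format segmentation labels (.txt) for each image.
auto_annotate(
    data="path/to/images",        # directory of images to annotate
    det_model="yolov8x.pt",       # detection model that generates the boxes
    sam_model="sam_b.pt",         # SAM model that turns boxes into masks
    device="",                    # assumed optional arg: '' lets Ultralytics pick CPU/GPU
    output_dir="path/to/labels",  # assumed optional arg: where label files are written
)
```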