Create concept guide section (huggingface#16369)
* ✨ create concept guide section

* 🖍 make fixup

* 🖍 apply feedback

Co-authored-by: Steven <[email protected]>
stevhliu and Steven authored Mar 25, 2022
1 parent ed2ee37 commit b320d87
Showing 8 changed files with 113 additions and 815 deletions.
76 changes: 37 additions & 39 deletions docs/source/_toctree.yml
@@ -5,10 +5,6 @@
title: Quick tour
- local: installation
title: Installation
- local: philosophy
title: Philosophy
- local: glossary
title: Glossary
title: Get started
- sections:
- local: pipeline_tutorial
@@ -17,30 +13,20 @@
title: Load pretrained instances with an AutoClass
- local: preprocessing
title: Preprocess
- local: task_summary
title: Summary of the tasks
- local: model_summary
title: Summary of the models
- local: training
title: Fine-tune a pretrained model
- local: accelerate
title: Distributed training with 🤗 Accelerate
- local: model_sharing
title: Share a model
- local: tokenizer_summary
title: Summary of the tokenizers
- local: multilingual
title: Multi-lingual models
title: Tutorials
- sections:
- local: fast_tokenizers
title: "Use tokenizers from πŸ€— Tokenizers"
- local: create_a_model
title: Create a custom model
- local: multilingual
title: Inference for multilingual models
- local: troubleshooting
title: Troubleshoot
- local: custom_datasets
title: Fine-tuning with custom datasets
title: Create a custom architecture
- local: custom_models
title: Sharing custom models
- sections:
- local: tasks/sequence_classification
title: Text classification
@@ -65,47 +51,59 @@
title: Fine-tune for downstream tasks
- local: run_scripts
title: Train with a script
- local: notebooks
title: "πŸ€— Transformers Notebooks"
- local: sagemaker
title: Run training on Amazon SageMaker
- local: community
title: Community
- local: multilingual
title: Inference for multilingual models
- local: converting_tensorflow_models
title: Converting Tensorflow Checkpoints
title: Converting TensorFlow Checkpoints
- local: serialization
title: Export 🤗 Transformers models
- local: performance
title: 'Performance and Scalability: How To Fit a Bigger Model and Train It Faster'
- local: parallelism
title: Model Parallelism
- local: benchmarks
title: Benchmarks
- local: migration
title: Migrating from previous packages
- local: troubleshooting
title: Troubleshoot
- local: debugging
title: Debugging
- local: notebooks
title: "πŸ€— Transformers Notebooks"
- local: community
title: Community
- local: contributing
title: How to contribute to transformers?
- local: add_new_model
title: "How to add a model to πŸ€— Transformers?"
- local: add_new_pipeline
title: "How to add a pipeline to πŸ€— Transformers?"
- local: fast_tokenizers
title: "Using tokenizers from πŸ€— Tokenizers"
- local: performance
title: 'Performance and Scalability: How To Fit a Bigger Model and Train It Faster'
- local: parallelism
title: Model Parallelism
- local: testing
title: Testing
- local: debugging
title: Debugging
- local: serialization
title: Exporting 🤗 Transformers models
- local: custom_models
title: Sharing custom models
- local: pr_checks
title: Checks on a Pull Request
title: How-to guides
- sections:
- local: philosophy
title: Philosophy
- local: glossary
title: Glossary
- local: task_summary
title: Summary of the tasks
- local: model_summary
title: Summary of the models
- local: tokenizer_summary
title: Summary of the tokenizers
- local: pad_truncation
title: Padding and truncation
- local: bertology
title: BERTology
- local: perplexity
title: Perplexity of fixed-length models
- local: benchmarks
title: Benchmarks
title: Research
title: Conceptual guides
- sections:
- sections:
- local: main_classes/callback
2 changes: 1 addition & 1 deletion docs/source/create_a_model.mdx
@@ -10,7 +10,7 @@ an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express o
specific language governing permissions and limitations under the License.
-->

# Create a custom model
# Create a custom architecture

An [`AutoClass`](model_doc/auto) automatically infers the model architecture and downloads pretrained configuration and weights. Generally, we recommend using an `AutoClass` to produce checkpoint-agnostic code. But users who want more control over specific model parameters can create a custom 🤗 Transformers model from just a few base classes. This could be particularly useful for anyone who is interested in studying, training or experimenting with a 🤗 Transformers model. In this guide, dive deeper into creating a custom model without an `AutoClass`. Learn how to:
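To make the distinction concrete, here is a minimal sketch of the two approaches the guide contrasts; the checkpoint name and configuration values are illustrative, not taken from the commit:

```python
from transformers import AutoModel, BertConfig, BertModel

# Checkpoint-agnostic: the AutoClass infers the architecture and loads
# pretrained configuration and weights from the checkpoint.
model = AutoModel.from_pretrained("bert-base-uncased")

# Full control: build the architecture yourself from its base classes.
# These configuration values are hypothetical; the model is randomly
# initialized rather than loaded from pretrained weights.
config = BertConfig(hidden_size=768, num_attention_heads=12, num_hidden_layers=6)
custom_model = BertModel(config)
```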

Expand Down
