Non-autoregressive Transformers (NATs) are an efficient class of image synthesis models. However, their performance is generally inferior to that of state-of-the-art image generation models (e.g., diffusion models) ...
We introduce phi-1, a new large language model for code that is significantly smaller than competing models: phi-1 is a Transformer-based model with 1.3B parameters, trained for 4 days on 8 A100s, ...
ecommerce_analysis/
├── config/
│   ├── __init__.py
│   └── logger.py    # Central logging configuration
├── data/
│   └── dataset.csv
...
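The `config/logger.py` module in the tree above is described as the project's central logging configuration. A minimal sketch of such a module might look like the following; the function name `get_logger`, the log level, and the format string are assumptions for illustration, not taken from the original repository:

```python
# Hypothetical sketch of config/logger.py (names and levels are assumptions).
import logging
import sys


def get_logger(name: str = "ecommerce_analysis") -> logging.Logger:
    """Return a named logger with a single stdout handler."""
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid attaching duplicate handlers on re-import
        handler = logging.StreamHandler(sys.stdout)
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```

Centralizing the setup this way lets every module in the project call `get_logger(__name__)` and share one consistent format and level.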