AWS and Hugging Face Unite to Enhance and Democratize Machine Learning and Artificial Intelligence Models

Amazon Web Services (AWS) is teaming up with machine learning (ML) developer Hugging Face to improve access to ML models and lower development costs. The long-term, non-exclusive partnership will enable developers to create and train next-generation ML models through Hugging Face on AWS. Both companies hope the alliance will help democratize ML development in the future.

Hugging Face will use AWS as its preferred cloud provider, giving its developer community access to the service’s array of artificial intelligence (AI) tools. Researchers and developers can build, train, and deploy ML models using Amazon SageMaker, AWS’s managed ML platform. Models are hosted on Amazon Elastic Compute Cloud (EC2), AWS’s scalable computing service, which supplies the dedicated cloud infrastructure and hardware for ML training.
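To illustrate what that workflow looks like in practice, here is a minimal sketch of launching a training job with the Hugging Face estimator in the SageMaker Python SDK. The training script, S3 path, instance type, and library versions are illustrative assumptions, not details from the announcement.

```python
# Minimal sketch: training a Hugging Face model on Amazon SageMaker.
# Assumes this runs in a SageMaker environment with an execution role available.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

# Hyperparameters passed to a hypothetical train.py training script.
hyperparameters = {
    "model_name_or_path": "distilbert-base-uncased",
    "epochs": 3,
    "train_batch_size": 32,
}

estimator = HuggingFace(
    entry_point="train.py",          # user-supplied training script (assumed)
    source_dir="./scripts",          # directory containing the script (assumed)
    instance_type="ml.p3.2xlarge",   # GPU-backed EC2 instance managed by SageMaker
    instance_count=1,
    role=role,
    transformers_version="4.26",     # version strings are illustrative
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters=hyperparameters,
)

# Launch the managed training job on AWS infrastructure.
estimator.fit({"train": "s3://my-bucket/train"})  # S3 path is illustrative
```

SageMaker handles provisioning and tearing down the EC2 instances for the job, which is the automation the partnership is meant to put within easier reach of Hugging Face’s community.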

Hugging Face cites two AWS tools developers will appreciate: AWS Trainium, a purpose-built chip for training deep learning models, and AWS Inferentia, an accelerator designed to speed up deep learning inference. The startup also hopes the partnership will make its open-source suite of pre-trained natural language processing (NLP) models easier to deploy and scale. In an announcement blog post, Hugging Face noted that both companies hope to “contribute next-generation models to the global AI community and democratize machine learning.”
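For the deployment side, the sketch below shows one documented pattern for serving a pre-trained Hugging Face model as a SageMaker endpoint. The model ID, task, instance type, and version strings are assumptions chosen for illustration; they are not specified in the announcement.

```python
# Minimal sketch: deploying a pre-trained Hugging Face model to a SageMaker endpoint.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

# Point the container at a model on the Hugging Face Hub (illustrative choices).
hub = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",
    "HF_TASK": "text-classification",
}

model = HuggingFaceModel(
    env=hub,
    role=role,
    transformers_version="4.26",  # version strings are illustrative
    pytorch_version="1.13",
    py_version="py39",
)

# Stand up a real-time inference endpoint on a managed EC2 instance.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)

print(predictor.predict({"inputs": "This partnership makes ML easier to use."}))
```

Scaling up from a sketch like this, or targeting accelerators such as Inferentia, is the kind of friction the two companies say the partnership aims to reduce.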