
A green, open-source language model with more than 10B parameters.

Project Information

Project Status: Recruiting
Project Region: CAREERS
Submitted By: Chris Hill
Project Email: Cnh@mit.edu
Project Institution: MIT
Anchor Institution: NE-MGHPCC
Project Address: Cambridge, Massachusetts 02139

Expected project duration (in months): 1
Preferred Start Date: As soon as possible.

Mentors: Chris Hill
Students: Recruiting

Project Description

We are developing a new language model derived from the EleutherAI GPT-Neo initiative ( https://github.com/EleutherAI/gpt-neo ) for application to two projects. Both need models with skill close to that of the state-of-the-art proprietary GPT-3 model. One project is a demonstration of the model for state-of-the-art image captioning; the other is the publication of the full model as an open tool for the research community.

For both projects we are interested in collaborating with Cyberteams students on model training optimization and testing. The project aims to run model training and evaluate performance on multi-node configurations of the AiMOS 6-GPU-per-node system. This will allow us to examine scaling and, after appropriate discussions with IBM teams, potentially prepare for large experiments. The model we will use is efficient, and some preliminary work has already been undertaken at MGHPCC. Both the RPI and MGHPCC systems have excellent carbon footprints, so we also anticipate reporting energy and emissions statistics that are state-of-the-art for large-scale language model training.
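To make the energy and emissions reporting concrete, a back-of-envelope estimate can be computed from GPU count, per-GPU power draw, run time, data-center power usage effectiveness (PUE), and grid carbon intensity. The sketch below is a minimal illustration of that arithmetic; the function name and every numeric input are illustrative assumptions, not measured figures for AiMOS or MGHPCC.

```python
# Back-of-envelope estimate of training energy use and CO2 emissions.
# All numeric inputs are illustrative placeholders, not measurements.

def training_emissions_kg(num_gpus: int,
                          gpu_power_watts: float,
                          hours: float,
                          pue: float = 1.1,
                          grid_kg_co2_per_kwh: float = 0.1) -> float:
    """Estimate CO2 emissions (kg) for a training run.

    pue: power usage effectiveness of the data center (>= 1.0).
    grid_kg_co2_per_kwh: carbon intensity of the electricity supply.
    """
    # Facility energy = GPU energy scaled by PUE, converted Wh -> kWh.
    energy_kwh = num_gpus * gpu_power_watts * hours / 1000.0 * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Hypothetical example: 8 nodes x 6 GPUs, 300 W per GPU, a 24-hour run.
kg = training_emissions_kg(num_gpus=48, gpu_power_watts=300, hours=24)
```

Lowering either the PUE or the grid carbon intensity scales the estimate down linearly, which is why siting training at low-carbon facilities such as MGHPCC matters for the reported statistics.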
