FlexFlow is a deep neural network (DNN) framework that automatically discovers fast parallelization strategies for distributed DNN training. FlexFlow generalizes and goes beyond today's manually designed strategies (e.g., data and model parallelism) by exploring parallelization opportunities across the Sample, Operator, Attribute, and Parameter (SOAP) dimensions. A novel execution simulator evaluates the runtime performance of candidate strategies, and an automated search algorithm uses it to discover highly optimized strategies that generally outperform manually designed ones.
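
To give a rough feel for the idea, the sketch below (not FlexFlow's actual API; operator names, the cost function, and the greedy acceptance rule are all placeholders) mutates per-operator SOAP parallelization configs and keeps whichever candidate a stand-in "simulator" predicts to be fastest:

```python
# Minimal sketch of simulator-guided strategy search over the SOAP space.
# Everything here is illustrative: FlexFlow's real simulator and search are far richer.
import random

OPERATORS = ["embed", "attention", "mlp", "softmax"]      # hypothetical DNN operators
DIMS = ["sample", "operator", "attribute", "parameter"]   # the SOAP dimensions

def random_strategy():
    # Assign every operator a (dimension, parallel degree) pair.
    return {op: (random.choice(DIMS), random.choice([1, 2, 4, 8])) for op in OPERATORS}

def simulate_runtime(strategy):
    # Stand-in for the execution simulator: predicts a per-iteration time.
    return sum(1.0 / degree + 0.05 * (degree > 1) for _, degree in strategy.values())

def search(iterations=1000):
    best = random_strategy()
    best_cost = simulate_runtime(best)
    for _ in range(iterations):
        candidate = dict(best)
        op = random.choice(OPERATORS)
        candidate[op] = (random.choice(DIMS), random.choice([1, 2, 4, 8]))  # local mutation
        cost = simulate_runtime(candidate)
        if cost < best_cost:   # greedy accept; a real search explores more broadly
            best, best_cost = candidate, cost
    return best, best_cost

if __name__ == "__main__":
    strategy, cost = search()
    print(cost, strategy)
```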

Joint Optimization

FlexFlow uses a novel hierarchical search algorithm to jointly optimize algebraic transformations and parallelization while maintaining scalability.
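
A minimal sketch of what such a hierarchical search could look like: an outer loop proposes algebraic graph substitutions while an inner routine parallelizes each candidate graph. The callables `substitutions`, `parallelize`, and `cost_of` are hypothetical stand-ins, not FlexFlow internals.

```python
# Hedged sketch of a hierarchical search: outer loop = algebraic rewrites,
# inner step = parallelization of the rewritten graph.
def hierarchical_search(graph, substitutions, parallelize, cost_of, rounds=100):
    best_graph = graph
    best_plan = parallelize(best_graph)            # inner optimization: parallelization
    best_cost = cost_of(best_graph, best_plan)
    for _ in range(rounds):
        for subst in substitutions:                # outer optimization: graph substitutions
            candidate = subst(best_graph)
            if candidate is None:                  # substitution did not apply
                continue
            plan = parallelize(candidate)
            cost = cost_of(candidate, plan)
            if cost < best_cost:
                best_graph, best_plan, best_cost = candidate, plan, cost
    return best_graph, best_plan
```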


Flexible Parallelization

FlexFlow supports parallelizing DNN training through combinations of the Sample, Operator, Attribute, and Parameter dimensions.
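
To make two of these dimensions concrete, the NumPy sketch below shows a single linear layer split along the Sample dimension (data parallelism) versus the Parameter dimension (one form of model parallelism); both partitions reproduce the unpartitioned result. This is illustrative arithmetic only, not FlexFlow code.

```python
# Two points in the SOAP space for one linear layer, simulated with plain NumPy.
import numpy as np

batch, d_in, d_out, devices = 8, 16, 32, 2
x = np.random.randn(batch, d_in)
w = np.random.randn(d_in, d_out)

# Sample dimension: each "device" holds a slice of the batch and a full copy of w.
sample_parts = [xs @ w for xs in np.split(x, devices, axis=0)]
y_sample = np.concatenate(sample_parts, axis=0)

# Parameter dimension: each "device" holds full activations and a slice of w's columns.
param_parts = [x @ ws for ws in np.split(w, devices, axis=1)]
y_param = np.concatenate(param_parts, axis=1)

assert np.allclose(y_sample, x @ w) and np.allclose(y_param, x @ w)
```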


Speculative Inference

FlexFlow accelerates generative LLM inference with speculative inference and token tree verification.
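
The sketch below conveys the general idea in hedged form: a small draft model proposes a token tree of candidate continuations, and the large model verifies them, accepting the longest prefix it agrees with. The model functions are placeholders, and a real system verifies the whole tree in one batched forward pass rather than token by token.

```python
# Illustrative speculative-decoding step with token-tree verification.
def speculative_step(prefix, draft_propose, target_next_token, tree_width=3, depth=4):
    # Draft model proposes a token tree, represented here as a list of branches.
    tree = draft_propose(prefix, width=tree_width, depth=depth)   # list of token lists
    best = []
    for branch in tree:                                           # verify each branch
        accepted = []
        context = list(prefix)
        for token in branch:
            if target_next_token(context) != token:               # large model disagrees
                break
            accepted.append(token)
            context.append(token)
        if len(accepted) > len(best):
            best = accepted
    if not best:                               # fall back to a single verified token
        best = [target_next_token(list(prefix))]
    return prefix + best
```

Because the large model only verifies proposals instead of generating every token autoregressively, several tokens can be emitted per verification step when the draft model guesses well.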
