Creating Your Own LLM From Open-Source Models


From a "simple" fine-tune to your own Mixture-of-Experts model, built from open-source models.

Nowadays, training an LLM from scratch is a huge effort even for very large companies. Starting from pre-trained models to build your own is no longer just a workaround for resource-constrained companies, but often a necessary starting point.

  • LoRA
  • Quantization and QLoRA
  • Injecting an embedding model into LoRA to manage multiple LoRA adapters
  • Mixing models
  • Creating your own MoE (Mixture of Experts) model from several models you have fine-tuned yourself
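The core idea behind LoRA in the list above can be sketched numerically: instead of updating a full weight matrix W, you train two small low-rank matrices A and B so the effective weight becomes W + (alpha / r) * B @ A. This is a minimal toy sketch with made-up shapes, not the talk's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): rank r is much smaller than the layer size
d_in, d_out, r, alpha = 16, 16, 4, 8

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                    # trainable, zero init -> delta starts at 0

def lora_forward(x, W, A, B, alpha, r):
    """Forward pass with the low-rank delta added to the frozen weight."""
    return x @ (W + (alpha / r) * (B @ A)).T

x = rng.standard_normal((2, d_in))
# With B zero-initialized, the adapted layer matches the base layer exactly,
# so training starts from the pretrained model's behavior.
assert np.allclose(lora_forward(x, W, A, B, alpha, r), x @ W.T)

# Parameter savings: full matrix vs. the two adapter matrices
full_params = d_out * d_in           # 256
lora_params = r * d_in + d_out * r   # 128
print(full_params, lora_params)
```

The savings look modest at these toy sizes, but with real layer dimensions (thousands) and small r, the adapter is a tiny fraction of the full matrix, which is what makes keeping many LoRA adapters around practical.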
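The MoE point can likewise be illustrated with a toy router: a gate scores each expert for the current input, keeps the top-k, and mixes their outputs with softmax weights. The expert matrices here are random stand-ins (in practice they would be your fine-tuned models or adapters), and the router would be learned rather than random:

```python
import numpy as np

rng = np.random.default_rng(1)

d, n_experts, top_k = 8, 4, 2
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]  # stand-ins for experts
W_gate = rng.standard_normal((d, n_experts))                       # router weights

def moe_forward(x):
    logits = x @ W_gate                      # one router score per expert
    top = np.argsort(logits)[-top_k:]        # indices of the top-k experts
    weights = np.exp(logits[top])
    weights = weights / weights.sum()        # softmax over the selected experts only
    # Weighted sum of the chosen experts' outputs; the others are skipped entirely
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.standard_normal(d)
y = moe_forward(x)
print(y.shape)
```

Because only top-k experts run per input, inference cost grows with k, not with the total number of experts, which is what lets a collection of separately fine-tuned models act as one larger model.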


Sebastiano Galazzo

CTO @Synapsia AI, Winner of Three AI Awards, 25 Years Working in AI and ML

Winner of three AI awards, I have been working in AI and machine learning for 25 years, designing and developing AI and computer-graphics algorithms.

I am very passionate about AI, focusing on audio, image, and natural language processing, as well as predictive analysis.
I have received several national and international awards recognizing my work and contributions in these areas.

A Microsoft MVP in the Artificial Intelligence category, I have the pleasure of being a guest speaker at national and international events.



Thursday, Sep 26 / 3:40 PM CEST (50 minutes)


Ballroom A