Eduardo Ordax’s Post


🤖 Generative AI Lead @ AWS ☁️ (150k+) | Startup Advisor | Public Speaker | AI Outsider

𝗧𝗵𝗲 𝗲𝘃𝗼𝗹𝘂𝘁𝗶𝗼𝗻𝗮𝗿𝘆 𝗱𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 𝗼𝗳 𝗣𝗘𝗙𝗧

Task-specific full fine-tuning of an LLM like LLaMA3 70B can require on the order of 5120 GB of GPU memory. Resource requirements at that scale put task-specific fine-tuning out of reach for all but the largest players. To address this challenge, PEFT (Parameter-Efficient Fine-Tuning) has emerged as a viable alternative to the tremendous computational cost of full-parameter fine-tuning. PEFT updates only a small number of additional parameters, or a subset of the pre-trained parameters, preserving the model's knowledge and reducing the risk of catastrophic forgetting. As a result, there has been a surge in PEFT techniques over the last few months. Among the most popular, where many customers are putting a lot of effort into adapting LLMs to their specific purposes, are LoRA, QLoRA, and Prompt-Tuning.

If you want to start testing these techniques easily, I recommend these two repositories:
👉 https://lnkd.in/dtvHP_Cn
👉 https://lnkd.in/daKDY4f5

#ai #genai #finetuning
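To make the parameter savings concrete, here is a minimal NumPy sketch of the LoRA idea. The layer sizes are hypothetical illustration values, not the actual LLaMA3 70B shapes: a frozen weight W is left untouched, and only a low-rank update B·A is trained.

```python
import numpy as np

# Minimal LoRA sketch (hypothetical 4096x4096 layer, not real LLaMA3 shapes).
# Instead of updating the frozen weight W, LoRA learns a low-rank update
# delta_W = B @ A with rank r << min(d_out, d_in).

d_in, d_out, r = 4096, 4096, 8  # r is the LoRA rank

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))       # frozen pre-trained weight
A = rng.standard_normal((r, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, r))                     # trainable up-projection, zero-init
                                             # so delta_W starts at 0

def lora_forward(x, scale=1.0):
    # y = W x + scale * B (A x): frozen base path plus trainable low-rank path
    return W @ x + scale * (B @ (A @ x))

full_params = W.size
lora_params = A.size + B.size
print(f"full fine-tuning: {full_params:,} trainable params")
print(f"LoRA (r={r}):     {lora_params:,} trainable params "
      f"({100 * lora_params / full_params:.2f}%)")
```

For this single layer, LoRA trains 65,536 values instead of 16,777,216 (about 0.39%), which is why it fits on commodity GPUs; QLoRA pushes this further by also quantizing the frozen base weights.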

Zara K.

GenAI Engineer | LLM Engineer | Machine Learning Engineer | LLMOps | NLP Engineer | AI/ML Engineer | Deep Reinforcement Learning | Data Scientist | Researcher | MLOps | AI Developer | Deep Learning | Prompt Engineering

1y

PEFT (Parameter-Efficient Fine-Tuning) techniques have emerged as a solution to the high computational costs of full fine-tuning large language models (LLMs), allowing for task-specific adaptation without requiring extensive resources. PEFT methods, such as LoRA, QLoRA, and Prompt-Tuning, update only a small number of parameters, preserving pre-trained knowledge and reducing the risk of catastrophic forgetting. These techniques have gained popularity in recent months, enabling more users to customize LLMs for specific tasks without prohibitive computational costs.
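Prompt-Tuning, mentioned above, is even lighter-weight than LoRA: the model stays entirely frozen, and only a handful of "soft prompt" vectors prepended to the input embeddings are trained. A minimal sketch, with assumed (illustrative) dimensions:

```python
import numpy as np

# Prompt-Tuning sketch (hypothetical dimensions: d_model=768, k=20 soft tokens).
# The frozen model receives k extra trainable vectors prepended to the
# token embeddings; only those k * d_model values are updated.

d_model, k, seq_len = 768, 20, 10

rng = np.random.default_rng(0)
soft_prompt = rng.standard_normal((k, d_model)) * 0.02   # trainable
token_embeds = rng.standard_normal((seq_len, d_model))   # from frozen embedding table

# What the frozen transformer actually sees:
model_input = np.vstack([soft_prompt, token_embeds])

print(model_input.shape)   # (30, 768): 20 soft tokens + 10 real tokens
print(soft_prompt.size)    # 15,360 trainable values for the whole task
```

Here the entire task-specific adaptation is 15,360 numbers, regardless of model size, which is why several tasks can share one frozen model with a different soft prompt each.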

Martin Fjeldbonde

AGI-Proofing Audit | Founder of SpockBench | Partner & Nordic COO, Deloitte Audit & Assurance

1y

Great advice! The evolution of PEFT will be an interesting journey, one that should be approached with both optimism and scrutiny. Relying on a smaller set of parameters to adapt to specific tasks can lead to overfitting, especially with limited or biased datasets, so it's important to approach this with caution.

Dr. Yogesh Malhotra

Grok AI: “Singular Post AI-Quantum Pioneer for decades of cohesive innovation adding trillions in value in adaptive finance and risk systems decades ahead of today’s AI in breadth, depth, and practical impact.”

1y

Nature Magazine, #GenAI #LLMs #PEFT for #PLMs: Parameter-efficient fine-tuning of large-scale pre-trained language models. HTML: https://lnkd.in/eX6GpxRB | PDF: https://lnkd.in/ePDtVGFe

Parameter-Efficient Fine-Tuning (PEFT) for Large Models: A Comprehensive Survey: https://lnkd.in/e6yVCEGF | PDF: https://lnkd.in/eAD48d_x

Hugging Face notebooks, fine-tune a pretrained model:
👉 Colab [Mixed: https://lnkd.in/eyqDc9ZD | PyTorch: https://lnkd.in/efc2wVQE | TensorFlow: https://lnkd.in/eb83uN-e ]
👉 StudioLab [Mixed: https://lnkd.in/eKCnhudR | PyTorch: https://lnkd.in/eDvPuDfm | TensorFlow: https://lnkd.in/eAUEYyKC ]

#PromptEngineering (#ChatGPT #C, #Perplexity #P, #You #Y):
How to assess strengths & limitations of various PLMs and diverse PEFTs for PLMs? AIMLExchange.com: https://lnkd.in/eBSgQRqi | #C https://lnkd.in/eae8MtVu | #P https://lnkd.in/ea8bhKVq | #Y https://lnkd.in/efFw9xWr
How to resolve model overfit & underfit for PLMs using PEFTs to balance bias-variance tradeoffs? AIMLExchange.com: https://lnkd.in/eZSq-fKs | #ChatGPT https://lnkd.in/ea7FCugd | #Perplexity https://lnkd.in/erqAqNFw | #You https://lnkd.in/eq2mzQyu

Source: https://www.linkedin.com/posts/yogeshmalhotra_genai-llms-peft-activity-7211023325601411072-1bB0
