2024: Testing AI algorithms for the automatic extraction and application of guitar effects to enable new product lines
Recreating guitar tones from existing music is a time-consuming and technically difficult task for musicians. The goal of this project was to test and validate AI-based methods for automatically extracting guitar effects from songs and applying them to new performances. The system was divided into three functional blocks, each validated with different neural architectures. For source separation (Block 1), the UVR and Moises.AI models showed promising results in isolating guitar tracks. For clean signal reconstruction (Block 2), a combination of Mel-Denoiser and HiFi-GAN showed potential but remains the key bottleneck. In Block 3, GCNTF was validated for modeling effects and reapplying them to new guitar inputs. A web interface was adapted for testing, and a command-line tool was developed for delay modeling. The project successfully demonstrated the feasibility of the core concept and identified the key areas for improvement. This work lays the foundation for future development toward real-time, AI-driven guitar tools.
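The three blocks compose into a straightforward offline pipeline. The sketch below is purely illustrative: the function names and the dummy signal processing are placeholders standing in for the models named above (UVR/Moises.AI for separation, Mel-Denoiser + HiFi-GAN for reconstruction, GCNTF for effect modeling), not the project's actual APIs.

```python
# Illustrative sketch of the three-block architecture.
# All function bodies are dummies; each would be a neural model in practice.

def separate_guitar(mix):
    """Block 1: isolate the guitar track from the full mix
    (stand-in for a source-separation model such as UVR)."""
    return [s * 0.5 for s in mix]  # dummy: attenuation in place of separation

def reconstruct_clean(guitar):
    """Block 2: recover the dry, effect-free guitar signal
    (stand-in for Mel-Denoiser + HiFi-GAN vocoding)."""
    return list(guitar)  # dummy: identity in place of reconstruction

def apply_effect_model(clean, wet, new_performance):
    """Block 3: estimate the effect from the wet/clean pair and apply it
    to a new performance (stand-in for GCNTF)."""
    gain = 1.2  # dummy "learned" effect parameter
    return [s * gain for s in new_performance]

def pipeline(song_mix, new_performance):
    """End-to-end: separate, reconstruct, then transfer the effect."""
    wet = separate_guitar(song_mix)
    clean = reconstruct_clean(wet)
    return apply_effect_model(clean, wet, new_performance)
```

A real implementation would operate on audio arrays (e.g. 44.1 kHz waveforms) and batch the blocks independently, since Block 2 is the identified bottleneck.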