From r/MachineLearning

You can decompose models into a graph database

https://github.com/chrishayuk/larql

https://youtu.be/8Ppw8254nLI?si=lo-6PM5pwnpyvwMXh

Now you can decompose a static LLM and run a KNN walk on each layer (each layer having been decomposed into a graph database), and it's mathematically identical to doing the matrix multiplication. This lets you update the model's internal factual knowledge without retraining (just insert into the graph DB), and it also uses less memory, since the weights are just rows in a database. The creator is a CTO at IBM.
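The core equivalence claimed above can be sketched in a few lines. This is an illustrative toy, not larql's actual implementation: the function names (`matrix_to_edges`, `graph_forward`) and the edge-list representation are my assumptions. The idea is that a dense layer's weight matrix is just a set of weighted edges, and walking each input node's outgoing edges while accumulating weight × activation at the destination reproduces the matrix-vector product exactly.

```python
def matrix_to_edges(W):
    """Decompose a weight matrix into (src, dst, weight) edges --
    the 'graph database' view of one layer. (Hypothetical helper.)"""
    edges = []
    for i, row in enumerate(W):
        for j, w in enumerate(row):
            if w != 0.0:  # sparse storage: zero weights need no edge
                edges.append((i, j, w))
    return edges

def graph_forward(edges, x, n_out):
    """Walk the edges, accumulating each contribution at its output
    node. Summing x[src] * weight over edges is exactly x @ W."""
    y = [0.0] * n_out
    for src, dst, w in edges:
        y[dst] += x[src] * w
    return y

def dense_forward(W, x):
    """Reference: ordinary matmul, y_j = sum_i x_i * W[i][j]."""
    return [sum(x[i] * W[i][j] for i in range(len(W)))
            for j in range(len(W[0]))]

W = [[0.5, -1.0],
     [2.0,  0.0],
     [0.0,  3.0]]
x = [1.0, 2.0, 3.0]

edges = matrix_to_edges(W)
assert graph_forward(edges, x, 2) == dense_forward(W, x)

# Updating "knowledge" without retraining then amounts to
# inserting or modifying edges (rows) in the database:
edges.append((1, 1, 4.0))
```

In a real deployment the edge list would live in an actual graph database rather than a Python list, which is where the memory savings and retraining-free updates described above would come from.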

submitted by /u/Educational_Win_2982


Tagged with

#graph database
#decompose
#static llm model
#knn walk
#matmult
#factual knowledge
#retraining
#memory usage
#database
#model update
#IBM
#CTO
#machine learning
#internal models
#insertion
#layer decomposition
#knowledge representation
#data structure
#algorithm