<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Unize]]></title><description><![CDATA[Updates for Unize, a venture that helps people and AI work with knowledge and achieve goals]]></description><link>https://blog.unize.org</link><image><url>https://substackcdn.com/image/fetch/$s_!koUZ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bfc27f0-4ec2-47d7-903d-c9bf99594285_480x480.png</url><title>Unize</title><link>https://blog.unize.org</link></image><generator>Substack</generator><lastBuildDate>Thu, 16 Apr 2026 01:01:35 GMT</lastBuildDate><atom:link href="https://blog.unize.org/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Brendon Wong]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[unize@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[unize@substack.com]]></itunes:email><itunes:name><![CDATA[Brendon]]></itunes:name></itunes:owner><itunes:author><![CDATA[Brendon]]></itunes:author><googleplay:owner><![CDATA[unize@substack.com]]></googleplay:owner><googleplay:email><![CDATA[unize@substack.com]]></googleplay:email><googleplay:author><![CDATA[Brendon]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Introducing Unize Storage]]></title><description><![CDATA[An AI system to generate high-quality knowledge graphs at scale.]]></description><link>https://blog.unize.org/p/introducing-unize-storage</link><guid isPermaLink="false">https://blog.unize.org/p/introducing-unize-storage</guid><dc:creator><![CDATA[Brendon Wong]]></dc:creator><pubDate>Thu, 26 Sep 2024 09:07:06 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!lauK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png" length="0" type="image/png"/><content:encoded><![CDATA[<p><a href="https://www.unize.org">Unize</a> helps people and AI harness knowledge and thinking to achieve goals. Today, we&#8217;re excited to announce that we&#8217;ve developed a new type of AI system capable of building knowledge graphs at any scale&#8212;meaning any length of input, any number of inputs, and any size of destination graph. We&#8217;re calling this system Unize Storage. Our alpha release, Unize ST-0.5, is available now in <a href="https://developers.unize.org/">our API</a>.</p><p>LLMs have limited-size &#8220;context windows&#8221; that cap the amount of input they can work with, and LLM performance <a href="https://arxiv.org/abs/2404.02060">degrades rapidly</a> on more challenging tasks as more input is added. This makes their &#8220;effective context windows&#8221; much smaller in certain cases, including knowledge graph generation. Regardless of the exact input cutoff, context windows create a problem when using LLMs to generate graphs: the graph generated from one segment of input text (commonly called a &#8220;chunk&#8221;) can conflict with the graph generated from another chunk.</p><p>All LLMs, and most frameworks built on top of them to generate graphs, suffer from this problem. Aside from Unize ST-0.5, the only other system we know of that attempts to address this challenge is LangChain&#8217;s LLMGraphTransformer module. 
It was created by the AI team at Neo4j, the leading graph database company.</p><p>Perhaps because larger-scale graph generation was previously infeasible, we were unable to find suitable benchmarks, so we created our own, called <a href="https://developers.unize.org/kgstorage">KGStorage</a>, with a single-input dataset of 5,000 characters and a multi-input dataset of 20,000 characters.</p><p>We ran Unize ST-0.5, LangChain&#8217;s LLMGraphTransformer, and GPT-4o (selected to represent unmodified LLM output) against KGStorage, with the following results:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lauK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lauK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 424w, https://substackcdn.com/image/fetch/$s_!lauK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 848w, https://substackcdn.com/image/fetch/$s_!lauK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 1272w, https://substackcdn.com/image/fetch/$s_!lauK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!lauK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png" width="1456" height="622" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:622,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:115781,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!lauK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 424w, https://substackcdn.com/image/fetch/$s_!lauK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 848w, https://substackcdn.com/image/fetch/$s_!lauK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 1272w, https://substackcdn.com/image/fetch/$s_!lauK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7610507-7a29-4363-8c2c-a60e7b9735dd_1920x820.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">KGStorage Results on Multi-Input Dataset</figcaption></figure></div><p>Unize ST-0.5 achieves state-of-the-art results on KGStorage&#8217;s multi-input dataset, topping critical measures like node accuracy by wide margins. These correctness metrics indicate that most nodes, relationships, and associated data accurately reflect the source text. LangChain&#8217;s performance is much closer to Unize ST-0.5 on KGStorage&#8217;s single-input dataset, achieving 89% node accuracy compared to Unize ST-0.5&#8217;s 96%, while GPT-4o scores 43%. This indicates LangChain provides value on top of raw LLM output on smaller inputs, though its performance declines significantly on larger inputs. 
You can read our full results and methodology <a href="https://developers.unize.org/kgstorage">here</a>.</p><p>While Unize ST-0.5&#8217;s results are strong, please keep in mind that it&#8217;s an alpha release, and its performance is not yet close to human level. Improvements are actively in the works, though!</p><p>You can sign up for Unize Storage <a href="https://developers.unize.org/">here</a>. Our user-friendly playground lets developers and non-developers alike experiment with the system easily, and we are currently giving away free API credits while supplies last. Please get in touch if you have any thoughts or feedback!</p>]]></content:encoded></item></channel></rss>