Tokenization is poised to revolutionize asset creation, trading, and accessibility, yet significant technical and regulatory hurdles remain as the industry strives for widespread adoption. In a recent interview, Rob Holmes, a Web3 and growth strategist, elaborated on the necessary components for achieving scalable tokenization. He highlighted the economic advantages of high-throughput networks like Hedera and emphasized the critical need for regulatory clarity and trust among enterprises.
Holmes pointed out that high-throughput public networks such as Hedera can process over 10,000 transactions per second, a stark contrast to Ethereum’s base-layer capacity of roughly 15 to 30 transactions per second. That headroom is essential for tokenized assets at scale, where even minor delays or fee spikes can deter adoption; congestion pricing on Ethereum during the decentralized finance (DeFi) boom showed how quickly costs can become a barrier to wider acceptance. According to Holmes, “more capacity means avoiding congestion pricing and keeping transaction fees low and more predictable,” which is especially crucial for lower-value assets such as invoices, carbon credits, and fractional shares of real estate.
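As a rough illustration of the fee economics Holmes describes, the sketch below compares per-transfer cost as a share of asset value under hypothetical fee levels; the network labels, fees, and ticket sizes are assumptions chosen for the arithmetic, not quotes of actual pricing.

```python
# Back-of-the-envelope illustration (all fee and asset figures are hypothetical):
# why per-transaction cost matters far more for low-value tokenized assets
# than for large transfers.

ASSUMED_FEE_USD = {            # hypothetical average cost of one token transfer
    "high-throughput network": 0.001,
    "congested L1 at peak": 15.00,
}

ASSUMED_ASSET_VALUES_USD = {   # hypothetical ticket sizes for tokenized assets
    "fractional real-estate share": 50,
    "carbon credit": 12,
    "invoice installment": 200,
}

for network, fee in ASSUMED_FEE_USD.items():
    print(f"\n{network} (fee ~${fee:,.3f} per transfer):")
    for asset, value in ASSUMED_ASSET_VALUES_USD.items():
        overhead_pct = 100 * fee / value
        print(f"  {asset:<28} ${value:>5} -> fee is {overhead_pct:.3f}% of value")
```

Under these assumed numbers, a $15 peak-congestion fee consumes over 100% of a $12 carbon credit but only a fraction of a percent on the low-fee network, which is the predictability argument in concrete terms.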
He noted that this technical capacity could pave the way for rapid growth in fractional ownership models, democratizing access to investments that were previously reserved for a select few. The potential for “real-time or streaming yield or dividend payouts” could enhance capital recycling, thereby improving the overall investment experience, Holmes added.
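Mechanically, the “streaming yield” idea reduces to splitting each payout pro-rata across fractional holders at short intervals. The sketch below shows that arithmetic under assumed balances and payout amounts; the function name and figures are illustrative, not any platform’s actual API.

```python
# Minimal sketch of pro-rata "streaming" payouts: a yield amount is split across
# fractional holders in proportion to their token balances at each interval.
# Holder names and amounts are hypothetical.
from decimal import Decimal, ROUND_DOWN

def distribute_yield(balances: dict[str, Decimal], payout: Decimal) -> dict[str, Decimal]:
    """Split `payout` pro-rata across holders by token balance."""
    total = sum(balances.values())
    if total == 0:
        return {holder: Decimal("0") for holder in balances}
    return {
        holder: (payout * bal / total).quantize(Decimal("0.000001"), rounding=ROUND_DOWN)
        for holder, bal in balances.items()
    }

# Example: a $120 dividend streamed in four $30 intervals to three fractional owners.
holders = {"alice": Decimal("600"), "bob": Decimal("300"), "carol": Decimal("100")}
for interval in range(4):
    slice_ = distribute_yield(holders, Decimal("30"))
    print(f"interval {interval + 1}: {slice_}")
```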
Institutional interest in tokenization has also surged, as evidenced by recent exchange-traded fund (ETF) filings from major firms such as Grayscale and BlackRock for Hedera (HBAR) spot ETFs. Bloomberg analysts have placed the approval odds at 90-95%. Additionally, proposed HBAR ETFs have appeared on the Depository Trust & Clearing Corporation (DTCC) eligibility list, signaling readiness to launch pending SEC clearance, which could come as soon as October and further open avenues for institutional capital.
On the regulatory front, banks and investment funds seeking to offer tokenized products have faced obstacles due to the absence of harmonized global regulations. Industry leaders like BlackRock and Franklin Templeton have called for clearer definitions regarding digital securities, while regulators in regions such as the EU and Singapore are beginning to introduce pilot programs for tokenized assets. Holmes underscored the importance of balancing regulatory innovation with compliance, stating, “the key is balance,” as overly complex regulations could hinder early-stage innovation.
He cited the Colombian project Suno as a successful model, illustrating how regulatory flexibility can encourage experimentation. By tokenizing small-scale solar farm infrastructure and attracting over 3,500 retail investors, Suno demonstrated market fit and is now preparing for full compliance with EU regulations. Such examples show how “regulatory sandboxes” can facilitate growth before formal licensing is required, as seen in Switzerland and the UK.
Holmes advocates for “graduated paths to compliance,” where lighter regulatory regimes allow for initial experimentation, transitioning to stricter oversight as projects advance. This approach is mirrored in the evolving EU MiCA framework and Japan’s guidelines for security tokens, signaling progress toward global convergence in regulatory standards.
As the industry moves forward, building liquidity and institutional trust remains paramount. Holmes observed that early tokenization efforts largely failed to deliver the enhanced liquidity they promised, with many tokenized assets trading only infrequently on secondary markets. Emerging platforms such as Ondo Finance and Polygon Labs are working to create liquidity pools and integrate tokenized offerings with DeFi infrastructure to foster broader market participation.
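To make the liquidity-pool approach concrete, the sketch below implements a simplified constant-product pool (x · y = k), the common automated-market-maker design, to show how pooled reserves give a thinly traded tokenized asset a standing quote. It is a generic illustration under assumed reserves and fees, not a description of Ondo Finance’s or Polygon’s actual mechanisms.

```python
# Simplified constant-product pool sketch: pooled reserves mean there is always a
# price at which a tokenized asset can be bought, even with few active traders.

class ConstantProductPool:
    def __init__(self, asset_reserve: float, stable_reserve: float, fee: float = 0.003):
        self.asset_reserve = asset_reserve      # e.g. tokenized fund shares (hypothetical)
        self.stable_reserve = stable_reserve    # e.g. a stablecoin (hypothetical)
        self.fee = fee                          # 0.3% swap fee paid to liquidity providers

    def buy_asset(self, stable_in: float) -> float:
        """Tokens received for `stable_in`, keeping reserves on x * y = k."""
        stable_after_fee = stable_in * (1 - self.fee)
        k = self.asset_reserve * self.stable_reserve
        new_stable = self.stable_reserve + stable_after_fee
        new_asset = k / new_stable
        out = self.asset_reserve - new_asset
        self.asset_reserve, self.stable_reserve = new_asset, new_stable
        return out

pool = ConstantProductPool(asset_reserve=10_000, stable_reserve=1_000_000)
print(f"Buying with $5,000 returns ~{pool.buy_asset(5_000):.2f} asset tokens")
```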
Holmes also emphasized the importance of governance models that reassure enterprises, such as Hedera’s governing council composed of global businesses, which fosters confidence that no single entity will dominate the network. However, he stressed that foundational trust concerns around security, compliance, and governance must be adequately addressed before liquidity becomes the critical focus.
In summary, the path to successful tokenization at scale requires a multifaceted approach—leveraging high-throughput networks, fostering regulatory clarity, and ensuring foundational trust among stakeholders. These elements will be essential in shaping the next wave of tokenization and unlocking its potential for broader market accessibility and liquidity.