Find out how RPC Fast implemented a hybrid underlayer powered by OVHcloud to deliver a cost- and performance-efficient Solana dedicated node cluster for Magneta Labs.
Magneta Labs approached RPC Fast for a Solana dedicated node cluster for blockchain analytics and trading data processing. RPC Fast offers comprehensive DevOps services for medium- to large-scale projects, designing, building, and maintaining custom infrastructure solutions. Magneta Labs chose RPC Fast as a node provider because of its experience in blockchain and tailored infrastructure solutions.
In turn, RPC Fast engineers chose OVHcloud to build a cloud architecture that meets the requirements of a Solana dedicated node cluster.
Implementing monitoring, alerting, and automation solutions for dedicated Solana nodes required an infrastructure upgrade. To deliver those changes, RPC Fast engineers planned and implemented the following improvements:
A single Solana node consumes around 80TB of bandwidth per month (40TB in + 40TB out) and demands fast, stable uplinks. Combined with frequently significant memory requirements, this shifts the underlayer choice from cloud to bare metal: at this level of resource usage, cloud computing becomes extremely expensive.
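To put that bandwidth figure in money terms, here is a rough back-of-the-envelope sketch. The 40TB egress number comes from the paragraph above, and the per-TB rate is the metered outbound-transfer price quoted in the comparison table in this article; real rates vary by provider and region, so treat this as an illustration rather than a quote:

```python
# Back-of-the-envelope comparison: metered hyperscaler egress vs. an
# unmetered OVHcloud uplink, for a single Solana node's monthly traffic.
# Figures are taken from this article; actual rates vary by provider/region.

EGRESS_TB_PER_MONTH = 40            # outbound traffic of one Solana node
RATE_USD_PER_TB = 102.40            # metered outbound-transfer rate from the table
OVH_SERVER_USD_PER_MONTH = 1364.99  # Scale-a3 dedicated server, traffic included

metered_egress_cost = EGRESS_TB_PER_MONTH * RATE_USD_PER_TB
print(f"Metered egress alone: ${metered_egress_cost:,.2f}/mo")   # → $4,096.00/mo
print(f"Entire OVHcloud server: ${OVH_SERVER_USD_PER_MONTH:,.2f}/mo")
```

At that rate, outbound traffic alone costs roughly three times the entire dedicated server, before a single hour of compute or a gigabyte of storage is billed.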
RPC Fast experts also wanted to be sure that hosting services would not all be terminated unexpectedly due to a ToS breach or any other blockchain-unfriendly policy. Sourcing new hardware that matches Solana node requirements within 1-2 days, to prevent downtime after an unexpected disconnect, can be really hard.
OVHcloud provided RPC Fast with the best resources at an honest price, compared to the other hyperscalers' offerings:
| Hyperscaler | OVHcloud | GCP | Azure | AWS |
|---|---|---|---|---|
| Name of server/instance | Scale-a3 Dedicated Server | Compute optimized VM c4a-highmem-64 | Compute (E64a v4) | Amazon EC2: r7g.16xlarge |
| Data center location | Asia (Singapore) | Asia (Singapore) | Southeast Asia | Asia Pacific (Singapore) |
| Processor | AMD EPYC GENOA 9354 - 32c/64t - 3.25GHz/3.75GHz | vCPUs: 64 | vCPUs: 64 | vCPUs: 64 |
| Memory | 512GB DDR5 ECC 4800MHz | 512 GiB | 512 GB | 512 GiB |
| System storage | 2× SSD NVMe 960GB Datacenter Class Soft RAID | 1600 GB temporary storage | General purpose SSD (gp3) | |
| Storage | 2× 3.84TB SSD NVMe Soft RAID | Boot Disk Hyperdisk Balanced 7680 GiB, $752.64/mo | S60: 8192 GiB, $262/mo | 7,680 GB × 1.00 instance months × $0.096 = $737.28 (EBS storage only) |
| Usage | Public bandwidth: 1Gbit/s unmetered and guaranteed; private bandwidth: 25Gbit/s unmetered and guaranteed | Outbound data transfer: 5GB | | Network performance: 30 Gbit; outbound data transfer cost (monthly): $102.40 (1TB) |
| Total instance usage time | | 730h per month | 730h per month | Instance: $4.1344/h; on-demand monthly: $3,018.11/mo (730h per month) |
| Price | $1,364.99/mo without commitment period | From $3,398.19/mo without commitment period, incl. Boot Disk | From $3,878.40/mo without commitment period | From $3,857.79/mo without commitment period |
Solana nodes may demand significant amounts of RAM for indexes, depending on the RPC methods the cluster is planned to support.
In some cases it is also possible to decrease RAM usage by applying filters prior to index generation. Either way, the hosting provider has to be able to match a 2TB RAM requirement when index filtering is not an option.
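For illustration, the fragment below shows how this is typically tuned on a Solana RPC node: each enabled secondary account index speeds up certain RPC methods (e.g. getProgramAccounts, getTokenAccountsByOwner) at the cost of extra RAM, and exclusion filters keep oversized accounts out of the index. The flag names are standard `solana-validator` account-index options, but the pubkey is a placeholder and the right combination depends on the RPC methods being served:

```shell
# Secondary account indexes are a major driver of RPC-node RAM usage.
# Enable only the indexes required by the RPC methods you serve;
# the exclude-key pubkey below is a placeholder, not a real account.
solana-validator \
  --account-index program-id \
  --account-index spl-token-owner \
  --account-index spl-token-mint \
  --account-index-exclude-key HeavyProgramPubkey111111111111111111111111 \
  --limit-ledger-size
```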
Kubernetes also places its own requirements on the underlayer to unlock everything K8s can do; the best-known ones are network storage and load balancers. It is also preferable to run K8s in the cloud to avoid hardware-specific issues and to get new nodes, or resize existing ones, in seconds. So RPC Fast engineers needed an additional public cloud environment to implement it.
To utilize the power of K8s in full, RPC Fast planned to maintain open-sourced Solana container images and a Solana Helm chart tailored to their needs.
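One common way such a hybrid layout is wired together (a sketch with made-up node names, not RPC Fast's actual configuration) is to label and taint the bare-metal workers so that only Solana pods land on them, while the control plane and low-load services stay on cloud nodes:

```shell
# Label a bare-metal worker and taint it so only Solana workloads schedule
# there; the node name, label, and taint values are hypothetical.
kubectl label node bm-worker-01 node-class=baremetal
kubectl taint node bm-worker-01 dedicated=solana:NoSchedule

# Solana pods then declare a matching nodeSelector and toleration in their
# spec, while monitoring/logging pods land on the cloud nodes by default.
```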
Challenges:
Through our partnership with OVHcloud, we can offer our clients the most cost/performance-efficient servers, especially for Solana, given OVHcloud's proven track record of running Solana nodes. We are also able to sort out any technical issues quickly thanks to direct contact with the OVHcloud team.
OVHcloud offered an interconnect between public cloud and bare-metal servers. It allows us to take the best of both worlds: the power of dedicated bare metal, plus the flexibility of cloud for the K8s control plane and low-load essentials like monitoring and logging systems. RPC Fast engineers even managed to offset the drawbacks of bare metal, such as new-node provisioning time, by keeping a few extra nodes alive as a typical workaround.
With the unique features of OVHcloud's services and the selected servers, RPC Fast engineers delivered a top-tier Solana dedicated node cluster for Magneta Labs, reaching the following goals: