
Dev QR Snap – S0, Complete

Update 6/13/23:

QR Snap has made some substantial hardware upgrades to ensure the reliability and redundancy of its data whilst improving performance. First, a second Dell R730 was added to host this blog and provide a PostgreSQL replication and backup server.
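For readers curious how a second box can act as a PostgreSQL replica, a minimal streaming-replication sketch looks roughly like this (hostnames, the replicator role, and paths below are placeholders, not QR Snap's actual configuration):

```
# On the primary (postgresql.conf): allow replication connections
#   wal_level = replica
#   max_wal_senders = 5

# On the primary (pg_hba.conf): permit the replica's subnet
#   host  replication  replicator  10.0.0.0/24  scram-sha-256

# On the replica: clone the primary's data directory; -R writes
# the standby settings so the server starts in streaming mode.
pg_basebackup -h qrsnap-s0 -U replicator \
    -D /var/lib/pgsql/data -P -R
```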

New QRSnapS1 Server Specs

  • Dell R730 chassis
  • 2x Intel Xeon E5-2620 v3 (weak, but it's all I need)
  • 288GB DDR4 2133 memory
  • Proxmox VE
  • Intel X520 10Gbit fiber link
  • 2x 870 EVO SSDs in RAID 0 (working on getting RAID 10)

Upgraded QRSnapS0 Server Specs (Primary QR Snap Server)

  • Dell R730 chassis
  • 2x Intel Xeon E5-2699 v4 (22 cores each)
  • 1.5TB DDR4 2400 memory
  • CentOS Stream 9 OS
  • 3x 3.2TB Seagate Nytro enterprise-grade SAS SSDs (RAID 0)
  • 3x 18TB WD HC550 SAS HDDs (RAID 5)
  • 2x 2TB Samsung 870 EVO (LVM2 cache for the RAID 5)
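For the curious, an LVM2 cache like the one above can be attached roughly as follows (device names, volume group, and logical volume names are placeholders, not this server's actual layout):

```
# Add the two 870 EVOs as physical volumes and extend the
# volume group that holds the RAID 5 logical volume.
pvcreate /dev/sdx /dev/sdy
vgextend datavg /dev/sdx /dev/sdy

# Create a cache volume on the SSDs and attach it to the
# (assumed) RAID 5 logical volume in writethrough mode.
lvcreate -L 3.6T -n fastcache datavg /dev/sdx /dev/sdy
lvconvert --type cache --cachevol datavg/fastcache \
    --cachemode writethrough datavg/slowdata
```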

QR Snap's network infrastructure has been upgraded as well: all servers are now linked with OM3 fiber at 10Gbit full duplex, with a Ziply Fiber 500/500 WAN on a static IP address.

The software stack has also been upgraded to support higher levels of linear scaling, using HAProxy and Nginx as load balancers and PostgreSQL as the database for QR Snap's traffic. The stack keeps evolving to further take advantage of the multithreaded nature of the server hardware I have. One example is using Python instead of NodeJS to generate QR codes, because Python handles threading via the use of queue objects much better than NodeJS, which is by nature single threaded.
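The queue-based worker pattern described above can be sketched in Python like this. Note that `generate_qr` below is a stub standing in for a real QR encoder (e.g. the third-party `qrcode` package), so the example runs without extra dependencies; the worker/queue structure is the point.

```python
import queue
import threading

def generate_qr(data: str) -> str:
    # Placeholder for a real QR encoder; tags the payload so the
    # worker pattern can be demonstrated without dependencies.
    return f"QR({data})"

def worker(jobs: queue.Queue, results: list) -> None:
    while True:
        item = jobs.get()
        if item is None:          # sentinel: no more work
            jobs.task_done()
            break
        results.append(generate_qr(item))
        jobs.task_done()

def generate_batch(payloads, num_workers: int = 4) -> list:
    jobs: queue.Queue = queue.Queue()
    results: list = []
    threads = [threading.Thread(target=worker, args=(jobs, results))
               for _ in range(num_workers)]
    for t in threads:
        t.start()
    for p in payloads:
        jobs.put(p)
    for _ in threads:
        jobs.put(None)            # one sentinel per worker
    jobs.join()                   # wait for every task_done()
    for t in threads:
        t.join()
    return results

codes = generate_batch([f"https://example.com/{i}" for i in range(8)])
print(len(codes))  # prints 8
```

`queue.Queue` handles all the locking, so the workers never touch a shared item at the same time; the sentinel `None` per worker is a simple way to shut the pool down cleanly.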

ORIGINAL POST

Finally, after two months, the QR Snap S0 server is complete. In the infancy of QR Snap, our service was hosted from a desktop computer running Node and PostgreSQL over WiFi. In hindsight, this setup worked incredibly well given how many points of failure existed. So what has changed since then? The answer is: a lot. From software to hardware, nearly everything has been revamped.

The biggest change would be the hardware used to host QR Snap. The original QR Snap host PC had the following specs.

  • AMD Ryzen 1700
  • 32GB DDR4 2866
  • 256GB Samsung 970 Evo NVME Drive
  • 12TB Western Digital Gold
  • Nvidia GeForce 970
  • TP-Link T9UH USB WiFi adapter
  • Windows 10 Pro

This setup was the humble beginnings of QR Snap, and it worked quite well from a functional standpoint. Though my humble host's limitations were quickly realized: this PC was my workstation, and as such was very vulnerable to updates, resets, crashes, and power outages. From the perspective of uptime I am amazed it did so well; the number of major service disruptions could be counted on one hand. The biggest issues were related to Windows updates and reconnecting to the WiFi after a reboot. Another issue, resolved in a nicer manner, was power outages and recovery from those events.

Dell R730, the next generation of QR Snap hosting.

After some discussions with a friend about the shortcomings of the current setup for QR Snap services, a gracious offer to sell me some actual server-grade hardware was made. I had no idea what these servers were, but at 2.9mBTC each the offer was too good to pass up. The reliability provided by server-grade hardware, combined with offloading the webserver to dedicated hardware, would be extremely beneficial when implemented. The key word, as always, is 'implementation': this step would prove to be one of the most tedious processes I have ever experienced in my life. At the same time, though, one of the best learning experiences awaited, and the payoff was massive.

As soon as I got these shiny new chassis home, the lids were off: checking for parts, configuration options, and overall condition. These were Dell R730 servers, so they seemed more modern than I anticipated, and each had 9 memory modules installed for a total of 144GB per machine. Each one had 2x spinning hard drives installed, an Intel Xeon E5-2620 v3 CPU, and an Intel X520 network card with 2x 10GbE SFP+ ports. The next step was to open each chassis and merge them into a single unit with a dual-CPU configuration and 288GB of DDR4 memory. 4x Samsung 870 Evo drives were added in RAID 0 for database storage and OpenStreetMap tile rendering. The CPUs were upgraded to dual Xeon E5-2699 v4s. A 256GB SSD was added as a swap drive. Three 18TB WD HC550s were added for the caching of OpenStreetMap tiles at lower zoom levels, and for redundancy. Overall this server was built for parallelism and work.

8TB Samsung 870 Evo Drives

After harvesting a whole chassis and building up the R730, a choice of OS needed to be made. The first choice was Ubuntu, due to its ease of access, user friendliness, plethora of tutorials, and massive community support. After installing Ubuntu, an overarching feeling of being a 'noob' flowed over me, and the decision was made to use a RHEL-based OS for 'Enterprise' grade reliability. Doing away with the plebeian Ubuntu for the more robust CentOS, the pain began immediately. Upon booting from USB, the CentOS installer had an issue where clicking the field to set up update repos created a state that required a reboot to get out of. The second mistake, to a degree, was choosing the 'Stream' version of CentOS. Looking back, using CentOS Stream 9 only created pain in the short term; long term, rolling updates are what I want and need. After the first install, CentOS was reinstalled because the installer had split the user's home directory onto its own partition. Though useful in some cases, for QR Snap a single homogeneous partition would let me get up and running faster and reduce the risk of running out of space. With a few clicks and reboots, QR Snap was now running with Red Hat Enterprise at its core.

The moment of truth, the trials of my manhood, would now be upon me. The monitor was unplugged, the server moved to the closet, and the ethernet cable firmly attached. Moving back to my Windows host, a terminal was fired up and an SSH connection was made. Voila: QR Snap now has a giant, loud, fast Raspberry Pi to run on.

Note: this is my first real foray into using Linux, and there is no better way to learn than to be thrown right into the fire. CentOS uses the yum (dnf) package manager, and Stream 9's repositories are sparse, to say the least. Compared to Ubuntu's apt it feels like a ghost town, and later I would find out that CentOS Stream 9's repository depth is extremely small compared to the non-Stream versions of CentOS. Many server utilities were missing, without a repository to fall back on.
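For what it's worth, a common way to widen the package selection on CentOS Stream 9 (not necessarily the exact steps taken here) is enabling the CRB and EPEL repositories:

```
# CRB (CodeReady Builder) carries many -devel packages
sudo dnf config-manager --set-enabled crb

# EPEL adds a large set of community-packaged server utilities
sudo dnf install -y epel-release
sudo dnf install -y htop    # example: not in the stock repos
```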
