
Release Notes - Compression, Caching and Bot Protection

Overview

Behind-the-scenes updates were made today that won't be obvious to users, but they will make Linkidex more performant and more resilient to high traffic from both real users and evil robots.

Compression

When possible, Linkidex now uses gzip or Brotli compression when communicating with clients. This significantly reduces the file size of larger assets, such as the Linkidex React Application. In my personal testing, I saw a reduction in file size of around 75%. We do this by using Amazon CloudFront.

Long story short, Linkidex now loads faster, especially over slower internet connections, and it uses less data to do so.
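
For readers curious about the mechanics, here is a minimal AWS CDK (TypeScript) sketch of the kind of configuration that turns on CloudFront compression. The stack, construct, and origin names are hypothetical, and Linkidex's real infrastructure may well be set up differently (for example, through the console or CloudFormation); this only shows the relevant knobs.

    import { Stack, StackProps } from 'aws-cdk-lib';
    import { Construct } from 'constructs';
    import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
    import * as origins from 'aws-cdk-lib/aws-cloudfront-origins';

    export class CompressionStack extends Stack {
      constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        new cloudfront.Distribution(this, 'AssetDistribution', {
          defaultBehavior: {
            // Hypothetical origin serving the built React application.
            origin: new origins.HttpOrigin('origin.example.com'),
            // Lets CloudFront return gzip- or Brotli-compressed responses
            // to clients that advertise support via Accept-Encoding.
            compress: true,
            // The managed CachingOptimized policy includes both encodings
            // in the cache key, so each compressed variant is cached separately.
            cachePolicy: cloudfront.CachePolicy.CACHING_OPTIMIZED,
          },
        });
      }
    }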

Caching

I have updated Linkidex to use a caching layer. Photos and large files, such as the Linkidex React Application, can now be served directly from the cache, which is faster than going all the way to Linkidex's servers for the same information. It also reduces demand on those servers, since they can focus on requests that can't be cached (such as users creating or editing links). The result is faster load times. Like compression, this is powered by Amazon CloudFront.
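
To make that concrete, here is another hedged CDK sketch. The origin domain and path patterns are hypothetical, but it shows the general shape: static assets get long cache lifetimes at the edge, while dynamic requests bypass the cache and go straight to the application servers.

    import { Duration, Stack, StackProps } from 'aws-cdk-lib';
    import { Construct } from 'constructs';
    import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
    import * as origins from 'aws-cdk-lib/aws-cloudfront-origins';

    export class CachingStack extends Stack {
      constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        // Hypothetical application origin; the real one will differ.
        const appOrigin = new origins.HttpOrigin('app.example.com');

        // Keep photos and bundled assets at the edge for a day by default,
        // and up to a year if the origin's headers allow it.
        const staticAssetPolicy = new cloudfront.CachePolicy(this, 'StaticAssetPolicy', {
          minTtl: Duration.seconds(0),
          defaultTtl: Duration.days(1),
          maxTtl: Duration.days(365),
          enableAcceptEncodingGzip: true,
          enableAcceptEncodingBrotli: true,
        });

        new cloudfront.Distribution(this, 'SiteDistribution', {
          // Dynamic requests (e.g. creating or editing links) skip the cache
          // and always reach the application servers.
          defaultBehavior: {
            origin: appOrigin,
            cachePolicy: cloudfront.CachePolicy.CACHING_DISABLED,
            allowedMethods: cloudfront.AllowedMethods.ALLOW_ALL,
          },
          // Static assets under these (hypothetical) paths are served from the cache.
          additionalBehaviors: {
            '/assets/*': { origin: appOrigin, cachePolicy: staticAssetPolicy },
            '/photos/*': { origin: appOrigin, cachePolicy: staticAssetPolicy },
          },
        });
      }
    }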

Bot Protection

I have enabled Amazon's Web Application Firewall (AWS WAF) for Linkidex. This will help reduce performance degradation if and when bots or malicious actors start blasting the site with traffic. Even when that traffic doesn't pose a security threat, blocking it is beneficial because it would otherwise waste bandwidth that real users could be using instead.
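
As a rough illustration, the sketch below defines a CloudFront-scoped web ACL with one of AWS's managed rule groups. The rules Linkidex actually runs aren't described in this post, so treat the specifics as placeholders; the resulting ACL's ARN would then be attached to the distribution via its webAclId property.

    import { Stack, StackProps } from 'aws-cdk-lib';
    import { Construct } from 'constructs';
    import * as wafv2 from 'aws-cdk-lib/aws-wafv2';

    export class BotProtectionStack extends Stack {
      constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        // CloudFront-scoped web ACLs must be created in us-east-1.
        new wafv2.CfnWebACL(this, 'SiteWebAcl', {
          scope: 'CLOUDFRONT',
          defaultAction: { allow: {} },
          visibilityConfig: {
            cloudWatchMetricsEnabled: true,
            metricName: 'SiteWebAcl',
            sampledRequestsEnabled: true,
          },
          rules: [
            {
              // One of AWS's managed rule groups, used here as an example.
              name: 'AWSManagedRulesCommonRuleSet',
              priority: 0,
              overrideAction: { none: {} },
              statement: {
                managedRuleGroupStatement: {
                  vendorName: 'AWS',
                  name: 'AWSManagedRulesCommonRuleSet',
                },
              },
              visibilityConfig: {
                cloudWatchMetricsEnabled: true,
                metricName: 'CommonRuleSet',
                sampledRequestsEnabled: true,
              },
            },
          ],
        });
      }
    }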

Thanks for your continued support!

~ David