How is this Website so fast!? — Breaking down the McMaster Carr website and the techniques they use to make it so dang fast
Posted by h4l@reddit | programming | View on Reddit | 366 comments
DesiOtaku@reddit
When I had a few people test out my website, I actually had a few complaints that it was "too fast". It turns out that if your website loads instantly, some people will see it as a red flag. It's kind of funny how people actually expect websites to take a while to load.
TravisJungroth@reddit
People respond differently when you’re asking for feedback, and for some reason this sounds like one of those cases. And no one who uses McMaster is going to see it as sketchy. It’s like the ultimate brand authority in its market.
DesiOtaku@reddit
Also to add to that, in the context of my website, it was taking in user input and responding back instantly. So a lot of people thought that their information was being lost.
CatolicQuotes@reddit
What was the feedback response after input submit?
DesiOtaku@reddit
"Did it go through?" "Did it really save what I wrote?"
Most websites take a solid second from hitting the button to seeing the next screen. I could add in a "Saving..." screen but that's extra work I am not willing to do right now.
CatolicQuotes@reddit
Sorry, I mean what feedback did website give to the user after submit?
DesiOtaku@reddit
It just shows the next page. You can try it out yourself:
https://clear.dental/newPatientDemo/
CatolicQuotes@reddit
thanks, I filled out the form and got absolutely no feedback at the end, just a blank screen. I would question the same. It's a huge form and nobody wants to do it twice.
Also, there is no feedback after each step. Steps where we just click a button are OK, but a longer form should confirm that the data was received. Also, going back on some steps asks me to confirm resubmission. I'd like to just have the form with the answers I typed, in case I want to change something.
I don't think speed is the problem, but the lack of feedback in the user experience.
DesiOtaku@reddit
Oh, that last page is actually a bug (I recently moved the demo server and forgot to set the permissions). I was actually talking about page-to-page navigation; because each page showed up instantly, they were worried everything on the previous screen was lost.
And yeah, the back button is a little bugged as well.
CatolicQuotes@reddit
Yeah, add some mini alert at the top saying "page 3 data saved", anything really. Remember, this is more than coding; this is communication. It is really fast. What's the stack?
DesiOtaku@reddit
Just PHP. It saves it as a .json file (hence the permissions issue). I am open to PRs.
https://gitlab.com/cleardental/cleardental-newpatientportal-web-client
timeshifter_@reddit
I've experienced the same thing. I custom built the entire system that ran that company, and as a result, everything was very tightly integrated, and we had it hosted on a pretty beefy VPS, considering its typical load. Round-trip times were often barely over 100ms, and when it came to security-related issues, it was so fast that users weren't convinced it was actually working. I ended up just putting an artificial delay in so it "felt" like the system was actually doing something... oh the irony.
CatolicQuotes@reddit
What was the feedback response after calculation is done?
timeshifter_@reddit
No additional comments one way or the other.
CatolicQuotes@reddit
no feedback to the user, I mean? a notification that the calculation is done, or something?
timeshifter_@reddit
Oh, that didn't change, it still displayed a confirmation message that the task completed. It just took slightly longer.
SegFaultHell@reddit
This is a pretty common thing too. Tax websites do it all the time: once you’ve put all your info in, the computer could give you a result in under a second, easy. They make you stare at a loading screen for 30+ seconds, though, because users don’t trust results that come back too fast.
In general, any calculation that the end user perceives as difficult or complicated can actually benefit from an artificial delay, so consumers will trust it. Just make sure you program a flag for it so you can turn it off for your own account and have a speedy experience lol
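The flag-plus-delay pattern is tiny. Here's a minimal sketch of the idea; nothing below is from a real tax site, and names like `fastMode` and `delayMs` are invented for illustration:

```javascript
// Wrap an instant computation in an optional theatrical pause.
// fastMode: internal/dev accounts skip the show; delayMs: how long to stall.
function withArtificialDelay(compute, { fastMode = false, delayMs = 3000 } = {}) {
  return async (...args) => {
    const result = compute(...args); // the actual work: near-instant
    if (!fastMode) {
      // Pretend the "calculation" takes a while so users trust the answer.
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
    return result;
  };
}

// Internal accounts flip the flag and get the instant version.
const estimateRefund = withArtificialDelay(
  (withheld, owed) => withheld - owed,
  { fastMode: true } // set false in production for the 30-second show
);
```

With `fastMode` on, `estimateRefund(12000, 10000)` resolves to `2000` immediately; with it off, the same answer arrives after the configured pause.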
rz2000@reddit
Freetaxusa doesn’t do this nonsense.
spinwizard69@reddit
That is a design problem on your part. People should get confirmation that a transaction actually completed correctly.
I've actually come across this on some sites, and you are left wondering if things happened as you expected.
SiegeAe@reddit
Yeah, and if the issue is that they still don't believe the confirmation, just show more of what they previously input. Done right, the quickness would leave people with a good impression of "how easy" the process felt.
I honestly think the standard "95% under 2 sec" bar is too low, and we should push the standard toward external network overhead plus under 100 ms (that, and actually do UX design that isn't just UI prettiness; some apps have like 30 steps for one concept).
Longjumping-Mud1412@reddit
Yea, I was not expecting to see McMaster on this sub
southseasblue@reddit
wtf is that font, looks fake
intermediatetransit@reddit
I’ve had to add fake spinners more than once to make customers more satisfied. It’s a thing.
ZebusAquaion@reddit
The ringing you hear when you make a call on a smartphone was artificially added. Same thing with electric cars being very quiet: some add artificial sound back in.
Karl_Satan@reddit
Yes, but these provide feedback. If your smartphone didn't ring, you wouldn't know whether a call connected without waiting to hear the other line or looking at your phone. The number of times I've sat waiting to hear the ring when making a call, only to realize the call never connected, is quite annoying.
With electric cars, it's much the same. There's also the "this big moving thing could kill you aspect"
UloPe@reddit
That’s the stupidest thing I’ve heard all week
intermediatetransit@reddit
Humans aren’t always rational, that’s how it goes.
zmeme@reddit
dude i felt my brain resist because the website was too fast, surreal
smartello@reddit
It only makes sense if the pages you navigate between are very different and the brain struggles with a sudden scene change. Adding latency is not the right way to respond to this feedback!
DesiOtaku@reddit
Tell me what you think (don't worry, the data won't be saved so you can write anything):
https://clear.dental/newPatientDemo/
icedrift@reddit
I get this. Extremely snappy websites feel like they must be lacking in something. I know this isn't the case but the feeling is there. It's kind of like how an EV feels like it shouldn't be that fast because we're so used to loud engines being associated with power.
spinwizard69@reddit
Interesting point. There is actually a segment of the ICE vehicle community that makes their cars loud on purpose, as that supposedly indicates performance. Then there is the rest of us who think these guys are ignorant and couldn't drive a high-performance car if their life depended on it.
Ran4@reddit
You tend to drive cars faster the less noise they make.
MINIMAN10001@reddit
That was me when I first tried Linux, was playing around with a VM for fun.
I couldn't get over how snappy everything was.
On Windows everything takes time when you open and click on things.
It felt like it was taking action before I expected it to start doing anything and it just felt weird.
spinwizard69@reddit
Windows has just gotten worse over time. My company tried MS Surface laptops for a bit and the behavior of Windows on those machines was terrible, especially when I had an M1 Air to compare against, with the M1 effectively being a cell-phone-class processor. Windows is somewhat better on newer hardware, but you still end up wondering if anything is happening after clicking an icon.
FatStoic@reddit
Windows has gotten noticeably shitter in my lifetime whilst linux has gone from a hobbyist's desktop to something I'd recommend for anyone who just wants a computer that goes to the internet. If the enshittification continues whilst desktop applications become less relevant I can see a scenario where Microsoft lose the casual desktop consumer market.
lt947329@reddit
Have you tried file explorer in Windows 11? It’s noticeably, painfully slow. I got worried at first and ran diagnostics on my storage and RAM because I thought my hardware was failing…
FatStoic@reddit
Yeah it's absolutely shocking. On a linux system you can realistically search all the filenames on your machine. On windows even a large folder of photos can grind down. I don't know how it can be so inefficient.
minderaser@reddit
God it drives me insane. I've had a bug for years now with taskbar icons randomly disappearing. Also, why does switching desktops take 2.5-3 seconds (and that is with a 5900X/RTX 3070)? That is real-time on macOS with touchpad gestures.
pjjaoko@reddit
Page transitions and CSS animations can slow things down in a nice way that lets the user know changes are just about to load.
acc_agg@reddit
Story time: I spent 5 years in big tech building backend services. I moved to a startup with a friend and he gave me a tar file with all the code for the project. I opened and built it in 20 seconds. The first thing I did was walk over to his desk to ask what I was screwing up with the build dependencies. Turns out it was nothing; it just built that fast. When you don't try to be everything for everyone, you can actually do good work.
itsmontoya@reddit
I usually add CSS delays when I can to give the impression of more time being taken.
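For what it's worth, that trick can be a few lines of CSS; the class names below are invented for illustration:

```css
/* Fade the result panel in over 300ms, after a 400ms pause, so the
   change reads as "work happening" rather than an abrupt flash. */
.result-panel {
  opacity: 0;
  transition: opacity 300ms ease-in;
  transition-delay: 400ms;
}
.result-panel.is-visible {
  opacity: 1;
}
```

Toggling `is-visible` from script then buys you 700ms of perceived "processing" without any actual delay in the data.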
Fiennes@reddit
TLDR it wasn't made by morons. The techniques have been known for decades, it's just most modern websites ignore it and throw shit down the pipe.
phillipcarter2@reddit
Most websites throughout history have ignored these practices too! For every bloated JS app you see today, there was some monstrosity stuffing enormous amounts of garbage into ASP.NET's ViewState years ago.
pyabo@reddit
That was me. I was doing that.
jeremiah15165@reddit
Same here brother, I was loading the entire site into view state on some apps.
pyabo@reddit
The craziest thing about ASP.NET was how well it all worked. One giant hack on top of a stateless, connectionless protocol, to make it feel like the Windows event-driven desktop programming model. Just a beautiful fiasco and it was brilliant.
realfranzskuffka@reddit
can you elaborate a bit more on this?
dbalazs97@reddit
i was also interested so asked our friend: https://chatgpt.com/share/689f83ef-0014-8007-b182-8a4f6eaa1c9c
realfranzskuffka@reddit
thanks for sharing, I appreciate it.
littlemetal@reddit
An entire dataframe with the whole table, no problem!
gonzofish@reddit
Can’t frontenders have anything?!?
-Knul-@reddit
Maybe I'm a moron but I didn't know you could do DNS prefetch.
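For anyone else who hadn't seen it: it's just a resource hint in the page's `<head>`. The hostname below is a placeholder:

```html
<!-- Resolve the CDN's hostname early, before any asset on it is requested -->
<link rel="dns-prefetch" href="//cdn.example.com">
<!-- preconnect goes further: DNS lookup + TCP handshake + TLS negotiation -->
<link rel="preconnect" href="https://cdn.example.com" crossorigin>
```

The browser treats both as hints, so unsupported or ignored hints cost nothing.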
michael2725@reddit
Agreed, too many people focus on UI because they assume it must be updated. For real tools it doesn't need to be. Ever seen a
giga@reddit
Yep and a lot of business owners have a set of priorities that is opposite to being fast. Mainly: ads.
Ads often mean loading a bunch of external code that almost actively fights your site’s performance.
I didn’t look into the site mentioned in the video but it wouldn’t surprise me if it had 0 ads. And good for them for having their priorities straight.
spinwizard69@reddit
When you have a business like McMaster you don't have to worry about ads. The whole point of your business is to serve up your inventory to other businesses. McMaster retains its leadership by offering a service that few can match.
If you take the time to compare McMaster's site to MSC or Grainger, you will see just how good McMaster's site is. There have been many cases where I gave up at alternative vendors and just went with McMaster.
This isn't to say McMaster is perfect; sometimes it's important to know the brand you're buying. McMaster simply doesn't focus on who actually makes the things you buy (never really has), and sometimes that matters.
Murky-Relation481@reddit
Yep, shop people did not like when we started instituting materials traceability at work for flight hardware and they couldn't just order random stuff off McMaster anymore and had to get a purchasing agent to actually order stuff from manufacturers directly with certs.
throwaway490215@reddit
It's not about showing ads on your page; it's about having ads run elsewhere that lead to your page, and integrating with that ad platform to see how well your ads are doing.
kog@reddit
The only reason a site loads slowly because of ads is incompetence
gimpwiz@reddit
Mcmaster is awesome if you machine anything.
centurijon@reddit
My company doesn’t have ads. What it does have is a marketing team that feels like they need to know EVERYTHING about what our clients are doing, right down to mouse movements, eye tracking, and brainwave scanning
4THOT@reddit
Nothing about ads requires a site to be slow, stop making excuses for incompetence.
mccoyn@reddit
Early in the Internet, some websites would mis-report the number of impressions an ad got. Ad companies eventually required their own analytics and tracking to verify the number of impressions. You end up with the web page loading the ad, then the ad loading the analytics. It's these layers of round-trip loads that often make pages with ads slow.
4THOT@reddit
Yea, I was on the internet since the late 90's. Nothing you've said contradicts anything I'm saying.
Everything you're talking about is measured in hundreds of milliseconds and is just another excuse for a dogshit user experience and not caring because "oh well it's just going to get slowed down by marketing (as if marketing is actually implementing analytics, who the fuck do you think you're fooling?) who cares if it takes 8 seconds to load".
The perfect example is the difference between new.reddit.com and old.reddit.com.
Reddit is primarily an ad-driven platform. Why isn't old.reddit taking 5 years to load with all these analytics hooks and ad trackers? Why is it orders of magnitudes faster than the typical parallax smooth scrolling UI/UX uwu in bio TODO React "web app" that takes years to load?
It's just an excuse for incompetence.
Show me how analytics can add more than 400 milliseconds to a page load that isn't just gross incompetence.
aveman101@reddit
Ads are where the money is, though. 😕
mccoyn@reddit
Not if you can charge McMaster prices.
MrWilsonAndMrHeath@reddit
People go to McMaster to order thousands of dollars of parts and have it delivered asap. If they put ads in the way of their convenience, they’d make a lot of engineers annoyed and lose their customer base.
spinwizard69@reddit
Not for a company like McMaster. Their money is in fast response to customer needs which results in sales. The people that use McMaster know what they want and fast is the only thing that counts.
uptimefordays@reddit
Depends on your industry. For digital media? Maybe. For a parts supplier, the money comes from selling parts.
mycall@reddit
I could imagine wasm ads would be slightly faster. Is that a thing yet?
MINIMAN10001@reddit
So it's not a thing.
But why would that be faster? At the end of the day, an advertisement is an image with a hyperlink; nothing about that loads faster with wasm.
Deranged40@reddit
Well, that's what needs to be rendered. But in order to render that image and hyperlink, most websites utilize Google AdSense or Facebook's ads, both of which bring in performance-hogging code that slows your site down before that image and hyperlink is shown.
maqcky@reddit
All this being true, all the APIs you need to do what you mentioned are exclusive to JS (at least, for now), so wasm is not going to improve anything. On the contrary, it will make it even worse.
Terrible_Visit5041@reddit
That's a moronic take. Slow websites aren't slow because the coders can't do better. Slow websites are slow because users don't switch to faster services. There is a threshold of "fast enough", and as long as you're within it, you get extremely diminishing returns on your work. The law of diminishing returns applies. And for what, when the users stay anyway? Why not ship faster instead? Why not use a tried and tested calendar implementation that you can add to your application in an afternoon, rather than take a whole week to code your own, tested only by you? Sure, the calendar library has some code for the Buddhist calendar, and that code is never invoked. It is downloaded, though.
If you want faster apps, stop using slow ones. And for this goal, first step: leave Reddit. That's fucking slow.
Sudden_Panic_8503@reddit
You are being downvoted because you are on a programmer subreddit, and you clearly are not a programmer.
Terrible_Visit5041@reddit
I am a programmer. I doubt you guys have been programming for a living.
EveryQuantityEver@reddit
Wrong. They are slow because the developers don't make them better.
Terrible_Visit5041@reddit
Alright, let's play your logic game.
Developers are not making the website faster. That's true. But why not?
Because they refuse to put work into it or because some C-level tells them something else is more important? I guess it is because the C-level tells them something else is more important.
So, why does the C-level not see how important that is? Because users are using it, even if it is just fast enough and not fast. Alright, why are users still using it?
The developer has no incentive. He wants to get paid. There is no pay in working against his C-level. The C-level has no incentive, after all, faster does not bring in more money. The user has an incentive, after all, you're here bitching about it. You apparently would love to see those websites faster. So, put your fucking money where your mouth is and get off reddit and only come back when they sped it up. Hypocrite.
bildramer@reddit
While I agree C-level minds can't handle the idea of counterfactuals, "putting your money where your mouth is" is impossible because of strong network effects. A website isn't like flour, it's more valuable to users if it already has many users.
Terrible_Visit5041@reddit
Sure, always an excuse that it wasn't your responsibility to do anything. Not even voting with your wallet is your responsibility.
bildramer@reddit
It's not an excuse. Of course I avoid modern webshit personally. I really want less of it so I care which methods to fight it actually work, instead of doing the intuitive thing and hoping for the best.
Terrible_Visit5041@reddit
And the method you settled for is still helping to create content while bitching about people doing their job rather than doing what you'd like to do?
VeryOriginalName98@reddit
Interesting take, blaming the consumer, who has no choice, for not choosing a better service that doesn't exist, instead of the people choosing to make shitty sites that force people to stare at ads longer, because that's the business model that works when the company has no moral standing.
Terrible_Visit5041@reddit
Moral standing? Give me a fucking break. We speak about a website that takes a few seconds to load rather than child slaves.
makonde@reddit
You are correct. Not only do they not leave, but more crucially, I think, they CAN'T leave for a lot of services. A lot of things simply don't have real competition, and people have a lot of lock-in, so speed only becomes a problem when it's really poor; otherwise it's better to focus on features that will drive revenue or attract new users.
Especially true for business-to-business software that costs tens or hundreds of thousands a year, negotiated up front. Even consumer things like Netflix, YT, Gmail, even Amazon, have a lot of lock-in.
Flubadubber@reddit
Kinda surprised to see this getting downvoted so heavily, this feels like a pretty reasonable take to me. Maybe some of the haters haven’t worked on a large commercial product before.
The purpose of a commercial website is to serve a business/client need. If you already have a website which fits that need and is fast enough for 99% of your users to not care, these optimizations really aren’t all that necessary. Prefetching is a nice optimization to have but I’m not gonna prioritize implementing it over client-facing bug fixes or feature requests, and trust me there are thousands of those in our backlog too. “Implement HTML prefetching” sounds like a low priority ticket that will unfortunately never see the light of day due to the realities of operating a business.
If the business doesn’t care, and the users don’t care, and it doesn’t impact maintainability, then I probably won’t put extra hours into doing it
lowbeat@reddit
change nickname to terrible take
Terrible_Visit5041@reddit
People don't need to like it, but it is the reality of software development.
davvblack@reddit
yeah i don't know why you are getting downvoted, but "website a little slow" isn't a business problem. "website so slow people are leaving without checking out" is a business problem... but that's way slower than what people are talking about here.
this is something we're struggling with at work right now. Thinking of "fixing latency" as an engineering-driven problem isn't right; it's not up to engineering to decide how much each millisecond is worth in person-hours.
argh523@reddit
Technically correct, but in practice it leads to "it works on my machine". Unlike programmers and management, your customers might be on inexpensive 5-10 year old hardware, and I can tell you from experience: you can really tell what is classic server-side rendered and what is barfed up by a JS framework.
Terrible_Visit5041@reddit
What utter bull. I said they are slow because users don't switch just because something is slow. I told users to play a game that is better optimized instead of buying a new computer. This complete misreading of my comment is a level of reading comprehension that is actually brain dead.
But hey, you're on reddit. You know how slow it is. Even if, as someone suggested, you use old reddit, guess what: that's still slow. People are on reddit, not admitting that they are exactly the kind of users responsible for why our internet is so slow.
swan--ronson@reddit
Surely slow websites don't meet that "fast enough" threshold you literally described?! What a contradicting, rambling stream of nonsense.
Terrible_Visit5041@reddit
Pretty much most are hitting the threshold of fast enough. I don't know if I recall the study correctly, but I think it was around 2-3 seconds for initial page load. So no, I don't contradict myself. Funnily enough, very often a responsive waiting indicator makes everything feel better: add a spinner and people will think the website is fast.
I understand why you all dislike the reality that better engineering is not rewarded. I dislike it, too. But that's just not how the world works.
BanAvoidanceIsACrime@reddit
"Everyone else is serving slop, so why don't we serve slightly better slop? We are here to optimize money, and nothing else" Spoken like a shitty manager
gmes78@reddit
old.reddit.com exists.
Deranged40@reddit
So you're telling me that there's more to making a website than 13 `npm install` commands and 4 lines of code to call one of those libraries?
uCodeSherpa@reddit
I mean. Come on dude.
Prefetching and cached SSR are doing 99% of the heavy lifting here, and both are utterly trivial to set up.
Nothing here takes significant developer time (it probably takes less dev time than crazy SPA techniques). Nothing is absurd or overwhelming. Nothing is stuff that web developers should not know about. The ONLY difference between your comment and the developers of this site is that they care.
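As a rough illustration of how little code link prefetching takes; this is a generic sketch, not McMaster's actual implementation, and the helper name is invented:

```javascript
// Build a <link rel="prefetch"> hint for a URL. Returned (not appended)
// so the caller decides when to attach it to <head>.
function prefetchHint(doc, href) {
  const hint = doc.createElement('link');
  hint.rel = 'prefetch'; // low-priority fetch into the HTTP cache
  hint.href = href;
  return hint;
}

// In a browser, wire it up: prefetch a link's target the moment the user
// hovers it, so the real navigation is served from cache.
if (typeof document !== 'undefined') {
  document.addEventListener('mouseover', (event) => {
    const link = event.target.closest('a[href]');
    if (!link || link.dataset.prefetched) return;
    link.dataset.prefetched = 'true'; // avoid duplicate hints per link
    document.head.appendChild(prefetchHint(document, link.href));
  });
}
```

One delegated listener, no framework; the `dataset.prefetched` guard keeps repeat hovers from stacking up duplicate requests.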
Worth_Trust_3825@reddit
The original comment was pretty sarcastic.
Ok_Coast8404@reddit
Either the person replying has Autism or is a joker themselves?
Worth_Trust_3825@reddit
Yes
zelphirkaltstahl@reddit
With a traditional web framework, there wouldn't even be anything to set up. There is no SSR to configure, because that is how things work anyway: you render the template and you serve it. Maaaybe the cache, but even that may be configured on the reverse proxy, not in the actual website.
Google__En_Passant@reddit
you forgot to mention how big your node_modules folder is after that, but it's understandable, you probably ran out of memory
dylan_1992@reddit
That’s because modern frameworks are designed with abstractions and separation of concerns so that you can throw more bodies at a problem and everyone can work in parallel.
To be super performant, you need tight coupling and that doesn’t scale well when you’ve got a massive team and anyone can break everything when they’re adding a feature.
SanityInAnarchy@reddit
This is sort of generally true, but I don't think it explains web frontends in particular being slow. Websites are fast by default. You don't need tight coupling to do that, you just need to not add absurd amounts of bloat that nobody asked for.
Most of the things on that second link are not a result of modern frameworks. Modern frameworks don't force you to use a constantly-looping video as a background image, or a three megabyte image at the top of an article that really should just be text, or to mix in so many ads that it doesn't matter how long your page actually takes to load because the user will be fighting to dismiss the interstitial and click through the cookie prompts to even find out how much it didn't load.
derangedtranssexual@reddit
I’m so tired of seeing motherfuckingwebsite.com, it’s such a stupid site
VeryOriginalName98@reddit
Rust has entered the chat.
SanityInAnarchy@reddit
IMO these should be required reading for frontend devs:
This is a motherfucking website:
The website obesity crisis:
VeryOriginalName98@reddit
Hang on, you mean to tell me that 20mb of JavaScript on every page load slows a site down more than pre-rendered static pages being served when requested? Next, you are probably going to tell me some nonsense about “hype” not being a good metric for choosing tools.
versaceblues@reddit
The website shown in this example is also extremely minimal.
Most modern businesses need to balance features and profit alongside performance.
Usually there is a threshold of "fast enough", generally anything under 2s (in some cases under 1.5s), where investing additional effort in performance offers heavily diminished returns.
G_M81@reddit
I absolutely knew they were using a CSS image map as soon as I saw the thumbnail. Took me right back to 2006-2010 web performance stuff.
winowmak3r@reddit
It really is a website made for engineers by engineers. No frills, simple as fuck to use and they have just about anything an engineer would need to spec out a design or order a few extra parts for prototypes. I wish more of the web was as utilitarian.
smallballsputin@reddit
You mean, it was not made by react soy devs.
Familiar-Flow7602@reddit
Flash?
JustaDevOnTheMove@reddit
Spot on!
Civil_Inattention@reddit
It's super fast because they're constantly torturing the people who design the website. It's built entirely on hate, spite, and bad faith. And JavaScript.
t0astter@reddit
I mean, you're not wrong. I interviewed there and it was the most depressing office I've ever visited in my life. Their employee policies are horrendous as well. They do pay exceptionally well, but it's not worth the terrible policies they have.
Civil_Inattention@reddit
They are horrific people. It’s a giant Milgram experiment.
spinwizard69@reddit
McMaster-Carr is an amazing company! As a little kid in the 1960s, I can remember my father bringing home the "old" copy of their catalog from work. For a kid it was absolutely fascinating to browse through that big yellow catalog.
Today their website should be considered a benchmark for many e-commerce sites. For the type of person who already knows what they need, it is one of the most responsive sites going and, more importantly, fast. It simply doesn't get in your way when you are trying to get your work done. By the way, in my mind there is a difference between responsive and fast.
It is great that an old-school company can transition to the modern world in such an impressive way. Many others have failed or been eaten up by Amazon or one of the others.
bwainfweeze@reddit
Imagine if the website people for MC had worked on Sears when it was still relevant…
spinwizard69@reddit
Actually that is a good point. I wonder if the MC site is all done by internal development teams. I could see website contractors trying to sell Sears the latest and greatest updates to make their site fashionable, while over at MC there is likely a management team and software team working together to make the site a customer delight.
espeero@reddit
I believe they are. One person I knew online worked for them and had pretty positive things to say about the work environment. Nothing amazing, but a general belief that everyone was swimming in the same direction and just trying to make the company better. Pretty refreshing.
No-Ruin978@reddit
Sears made weird big internet bets too early, e.g. launching an ISP called Prodigy; their big problem was being killed by Walmart and such before e-commerce really took off.
funnily enough, I'd bet there's MC and Sears crossover, developer- and staff-wise, since they are both based in the Chicago suburbs
FatStoic@reddit
We're certainly in a weird place when a simple website that gives users what they want and then gets out of their way is somehow a revelation.
lashib95@reddit
Is it slow now? I just checked it today and can't see any html prefetching.
Agile_Wallaby_6251@reddit
I would just like whoever posted this to know that our tech department had a town hall about this tweet & youtube video and refreshed all of us on what we actually do to be so “Wicked Fast”. Thanks for appreciating our hard work. We all really appreciated it. -swe at said company
nicoconut15@reddit
That's actually very cool
memtiger@reddit
I love that they're doing all this and looking like an old 2000s era website design. It's like a sleeper car that's modded with a 600HP engine.
dzikakulka@reddit
It's honestly pretty sad that they have all the anchoring and nav bars and information sidebars and all... and it looks outdated because, I guess, they don't pad everything out with swaths of whitespace? Or add huge logos and hover popout photos? I wish modern design had never happened at all.
ThisIsMyCouchAccount@reddit
Modern design isn't the problem.
It's not designing for what it is.
Too many times the people calling the shots want to design a utility like it's not one. Which is kinda what you're talking about.
You could totally make this look modern as hell with zero impact on performance. But the general layout and interactions would be the same.
alonjit@reddit
yeah, but the developers have the correct set of priorities: modern look is not on the list.
there's no reason to be. it adds nothing at best, detracts from the experience usually.
lookmeat@reddit
Yup. Also, in this space a skeuomorphic approach makes sense: people probably have the paper catalog around, and giving both a similar presentation and interface hints makes a lot of sense. Say you're at the workshop: you see a broken part, you look it up in the catalog to give the customer a price range, they agree, so you go to the office, open the website, and look for the same part to place the order. You already did all the work of finding it in the paper catalog; why not let yourself reuse that knowledge? You see the same thing, in a similar place, and instead of flipping to a bookmarked page, you click the link: the next content appears in the same time it takes to flip the page.
And honestly, there's a part of modern design we don't talk about: it's meant to cover for crappy and mediocre stuff. Modern design is built around the fact that websites nowadays take too long to load. First, you have a myriad of resources and a lot of cruft: tracking, spyware, animations, complex CSS, complex JavaScript. Second, these resources are nested deeply: the JS pulls in more CSS, which pulls in more images, and you can't know what else you need until you've fetched the last thing.
So what do you do? First, everything "fades in" with animations, so the load time reads as intentional glitz rather than an inevitability of highly inefficient design. It also hides that the website is doing far more than it should. To hide the jankiness, you add "smooth" scrolling (read: super slowed down, because you could smooth the transition and still go as fast as the user wants, which is already the default), which again masks how slowly things load. The other modern anti-pattern is putting everything below the fold: you load a website and all you see is one massive logo, and if you want any content you have to scroll, which again lets the site take forever while hiding its mediocrity behind glitz. And to disguise how much of the page is whitespace (which would make it obvious something else is going on), it somehow still finds a way to feel cluttered.
alonjit@reddit
excuse me, that is called "clean design".
CrunchyTortilla1234@reddit
Modern design is absolutely the problem, as it branded good UI practices as outdated and branded outright waste of user time and space as "modern".
ThisIsMyCouchAccount@reddit
When I say "modern design" I mean "not look outdated".
You can update the design and barely change the base UI. It's mostly an aesthetic change. Not a drastic upheaval of how you interact with the site.
You can be functional and look good. There is nothing that prevents that.
JewelerUnited@reddit
I agree. True graphic designers prioritize aesthetics, which can coexist with functionality. In my opinion, this site focuses on functionality but lacks aesthetic appeal.
CrunchyTortilla1234@reddit
I don't even think it's outdated. It's just basic
topological_rabbit@reddit
I order from this site at work and it's easily the most useful supply website I've ever used. Blows their competition out of the water for how easy and fast it is to find exactly what you want in a large catalog.
tennismenace3@reddit
For real. They have the "function" absolutely down even if their website doesn't look attractive. It's absolutely unbeatable to be able to quickly find any basic hardware item and have it delivered the next morning.
MrAnderson69uk@reddit
I guess in their game, online parts lookup, it's the best way to get and keep customers, so you have to be a cut above the rest of the templated storefronts. This may look dated, but it carries all the information you need in an uncluttered and consistent "parts catalogue" way, and I would expect a lot of the backend is custom-built libraries on a legacy HTML framework. Naming conventions may have just followed a C# backend with no camelCase conversion, as I didn't seem to notice any JSON data, just HTML.
In a project I worked on, a B2B sales application for social selling (basically research and know your prospective customers and contacts instead of the old cold-calling way), our site was built on MVC and the pages for the controllers' actions had their own JS files. I also developed an MS Dynamics plugin that communicated with a WebApi controller to get its skeleton HTML, where each part, when rendered with its embedded JS, would request its content and data from the web server over JSONP. Therefore each page had its own set of JS files. It's basically a project organisation thing, as well as not transmitting unused JS, for performance!
stahorn@reddit
Does someone have a good example of a similar web store but with this modern design? I'm terrible with recognizing different designs so I have no idea.
I tried the McMaster website, and I have to say it's not only super fast but also super user friendly. Imagine you need a strange screw whose name you forgot, and then you just get this perfect webpage: go to "screws" and browse until you find what you want. You click it, and immediately you get to choose exactly which size you need. Click it, add as many as you want to your order. No extra design gets in the way of what you wanted to do.
MrAnderson69uk@reddit
I think it's designed for a different sort of user, one used to catalogues of parts, with an engineering background, who would know what a left-handed screwdriver or a long weight is!!! lol
Hnnnnnn@reddit
or they could just design it better. where's the harm with that? are they out of time?
Loaatao@reddit
Why fix a website that isn’t broken? So some software engineers (who most likely have never worked on a car before) can say it looks good?
Hnnnnnn@reddit
i was responding to that guy, ask him why he's sad that they don't have good-looking design.
i can share my opinion but that is beyond the context of that conversation:
imo doing this with actual good design would make the video better, because it would show how "far" you can get with optimization while preserving modern features.
GetPsyched67@reddit
It looks outdated because it looks like shit. That's about it really.
Loaatao@reddit
What’s the line for it being shit? Maybe if there were some buzzword colors and box shadows it would be better? /s
hx87@reddit
Modern design is fine, it's just overly optimized for small touchscreens instead of large monitors. IMO "mobile-first" design needs to die in a dumpster fire, and we should either go back to separate designs for mobile and desktop, or scale designs for larger screens to actually take advantage of all the available space.
PaintItPurple@reddit
I think there are a few reasons the page presented in this video looks outdated to many people:
- Color choice. There are pops of bright colors but not really any color scheme or language there. This was very common in the early days of the web, but is much less so now.
- A UI that at least visually resembles the kind of iframe-based layouts you'd commonly see around the turn of the century.
- The video creator appears to be using a browser that (I think?) has kinda ugly Windows 95-looking default widgets, which in combination with the other elements gives the impression of what it was like to browse the web a long time ago.
CrunchyTortilla1234@reddit
It's focusing usability over furious group masturbation session of graphics designers.
michael2725@reddit
As someone having a BS in mechanical engineering, the speed is definitely appreciated. I will say I do find that the UI serves a purpose. When you're using the website as a tool, you don't want to get turned around by a new layout/appearance. The website serves a purpose and serves that purpose well.
Extra_Programmer788@reddit
It's blazingly fast when compared to badly optimised websites; otherwise it feels like just a regular website that is optimised well, especially if you are browsing it from the other side of the world.
Whoa1Whoa1@reddit
Also, they only need to load a 93-kilobyte image as a sprite sheet of their inventory on a page. The pictures are tiny and grayscale, meaning they don't take up much space at all. Any website that wants to display one full-color photograph that looks good will easily take literally millions of bytes, as even compressed high-quality photos are at least 1 MB. Example: a website that sells watches or laptops or whatever gadget is probably gonna need at minimum 5 photos or the listing will look sketch as hell. With McMaster, you just need one symbolic, simple grayscale image, and then the specifications like length, head type, thickness, and thread spacing are what matters, and you know what you are buying.
Pseudoboss11@reddit
Watches, Black
https://www.mcmaster.com/5262N11
$449.98
Though this one does have 6 images on it. The images are all black and white, unlike their popsicles.
hypoglycemic_hippo@reddit
Funny, both of those links load an empty webpage with a header. They put a loading circle in the middle, and after it resolves, stay empty.
Website might not be that great after all. I am on standard Windows 10 and Firefox, no plugins.
naikrovek@reddit
I can see them both fine, so you may have a browser extension or a proxy or something interfering somewhere.
hypoglycemic_hippo@reddit
Epyo@reddit
Have to log in to see certain products. I see a lot of e-commerce sites do that, I think it's a thing to stop bots from seeing certain things.
spareminuteforworms@reddit
That's funny though, because who gives a shit what the laptop looks like? I want to see the unobfuscated specs and that's about it.
Whoa1Whoa1@reddit
If you only have one picture that is grayscale and only is 100x100 pixels and it says "super deal, RTX 4090 inside" and you buy it, that would 99% of the time be a scam. There's no fucking way people would be okay with an item nobody has ever heard of that doesn't have a brand name or reputation where you just put one tiny gray image out for it and expect people to think it's real. That is fine for a fucking screw or nail or bolt. With one 100x100 image you won't even know if the thing has a damn touchpad or regular keyboard layout. Might as well just delete the photo at that point lmao.
spareminuteforworms@reddit
It's fraud or it's not. I'm talking about the specs, not the picture that gives me warm fuzzies. If you've got a known brand, say Newegg, I think you could spot a stock laptop image downsized to 4 bits as long as you had the documentation. Oh, and don't forget to actually source the thing you are advertising.
Whoa1Whoa1@reddit
Well that's the thing: unless your audience already knows what the thing you are selling looks like, what it does, and what your build quality is typically like, you won't be able to sell any without establishing that first. If Dell or Asus or whatever company released a new model and said it was wildly different from their other laptops, but never posted anything except one 100x100 grayscale picture, yeah, nobody would fucking buy it. Also, 4 bits is one pixel with the ability to be black, white, or one of 14 shades of gray. Nobody is buying some shit they have no idea what it looks like just based on a description. People don't like using websites like eBay, Wish, Alibaba, Woot, etc., because even though you get a huge wall of text about how amazing the item is, you have no idea what you will actually get: you get one or two stock photos that may or may not even be the same as the thing you are bidding on or buying, and unless you know exactly what the product number is and what it looks like or does, it is a total crapshoot.
I don't even understand your Newegg idea. Are you saying Newegg could just put a 100x100 pixel grayscale pic of one generic laptop, slap it on all of their listings, and be as successful? I don't think so. Not every buyer knows exactly what each model of laptop even looks like, and they literally look at the side and back photos to figure out what the ports are rather than reading the full product specifications. You also kinda want to know WHERE the ports are on the physical thing, which is pretty much NEVER written in the specifications. Or where the fans are, or if the arrow keys are the smaller skinny versions, or if the numpad is jammed next to the rest of the keyboard, and so much more that is much better found out from a picture. I don't want to read paragraphs explaining where the ports are, or how the keyboard layout is, or how they did the function keys or numpad integration, or the location of any volume sliders or special keys. That would suck. Again, that's fine for a fucking nail or screw. There are only like 5ish parameters to worry about with a nail: material, thickness, width, length, tensile strength, made for indoor or outdoor, rustproof or not, etc. I don't really need a picture for a nail or screw other than a generic one. For screws you just need all that plus thread count, thread spacing, head type, etc.; a generic picture is fine.
Google__En_Passant@reddit
Dude, do you even jpeg? 100kb is all you need for a full product photo, maybe 200kb if you really want to see the finest details.
ekdaemon@reddit
You don't need 4MB or even 1MB images to look good on a website. You only need the 4MB images available if someone clicks on one of the smaller images.
Here is the type of image you see on Amazon when browsing a random listing:
https://m.media-amazon.com/images/I/61t6XIxGQFL.__AC_SX300_SY300_QL70_ML2_.jpg
8 kilobytes.
And here is the "high resolution zoom" version when you click or hover on the image:
https://m.media-amazon.com/images/I/61t6XIxGQFL.AC_SL1500.jpg
77 kilobytes.
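A minimal sketch of what ekdaemon describes, using the two image URLs from the comment above: the listing page only ever downloads the small thumbnail, and the large file is fetched on demand when the user clicks through. Fixed `width`/`height` attributes also keep the layout from shifting while the image loads.

```html
<!-- Listing page loads only the ~8 KB thumbnail; the ~77 KB zoom
     version is fetched only when the user clicks through to it. -->
<a href="https://m.media-amazon.com/images/I/61t6XIxGQFL.AC_SL1500.jpg">
  <img src="https://m.media-amazon.com/images/I/61t6XIxGQFL.__AC_SX300_SY300_QL70_ML2_.jpg"
       width="300" height="300" alt="Product thumbnail">
</a>
```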
SavingFromRyan@reddit
do you like that coffee machine ?
cvak@reddit
I bought it 2 weeks ago, it’s very good imo.
Perfect-Campaign9551@reddit
You are so seriously underrating this. It shows why server rendering should be the way to do things, scalability be damned. Companies just kept pushing the CPU onto the browser, and our online experience is shit because of it.
Extra_Programmer788@reddit
Absolutely agree with that
smallballsputin@reddit
Its webscale!
PFCJake@reddit
Made me think of ofc.nu (Swedish fishing store). It's blazingly fast. Made in C running on Slackware.
LloydAtkinson@reddit
The funny thing is this is ASP.NET, so C#, .NET etc. The second funny thing is this is actually pre-open source and cross platform .NET, meaning this is running on Windows and IIS, so around .NET 4.5 - .NET 4.8.
So, what this means is that should they upgrade to a more modern .NET version made in the ~ten years since 4.5-4.8 was released, their site will become significantly faster still. .NET has had huge performance improvements, it's great.
Finally, the last funny thing is that the usual crowds would be drooling and shitting their pants about "hurr durhhh .NET bad, not open source, C# is only windows". So yes, something running on Windows is beating the average website experience by 10x, and there is yet even greater performance available should they upgrade to a newer .NET running on Linux.
Meanwhile some hype wave chaser would suggest "they should just use Next and get SSR reee". Well, it is literally already using SSR.
naikrovek@reddit
How can you say this is standard ASP.NET? Other than the names starting with a capital letter, which means nothing in reality (some people just name things this way), I don't see any actual evidence of .NET Framework being involved.
pheonixblade9@reddit
I think a few hours or days might be underselling it - I was in charge of upgrading much of TFS/Azure DevOps to .NET Core and it was a pretty significant effort.
Blecki@reddit
This is how I feel writing websites in coldfusion. The older technologies just work but they aren't "cool".
madworld@reddit
They are very hard to maintain though. One security bug in one package might require upgrading many packages, which turns into dominoes pushing you towards modern tech.
Blecki@reddit
No? These stacks are still maintained by the vendors. Cf in particular is all-in-one. There are no dependencies. Frankly, the modern shit could learn from them.
brigyda@reddit
I don't go here but it showed up in my feed, and...what the fuck happened in this comment section?
illustrious_trees@reddit
Another major reason for it being fast: no ads/analytics. Not having that crap alone makes it so much faster.
Matt3k@reddit
There are calls to trk.aspx on all sorts of actions: browser resize events, product views, searches. It has quite a bit of tracking!
While frontend performance is important, and inlining your CSS is great and all, I believe the real secret to this site is in their backend. Which of course, a youtube video isn't going to be able to observe.
IAmTaka_VG@reddit
Yeah I’m an asp developer and their front end may look like it’s made in 2001 but I guarantee their backend is exceptional and I’d actually pay money to look over their code.
Their middleware must be insane; like he says, there is some sort of service dedicated to serving JS files and aggressive caching. I'd love to see how they deal with frequent price updates of products, as they cache almost everything by the looks of it. The sprite trick is brilliant and I'm stealing that for assets that don't change often.
dhiltonp@reddit
I'm not sure they do frequent price changes?
McMaster Carr is tailored for engineering/prototyping. You're not paying for the part as much as you're paying for the service around the part.
Parts have full specifications - dimensions, physical properties, and standards the parts are certified for, plus 3d models that can be used directly in CAD.
And it will show up the next day, during business hours.
In software terms, it's like having a faster build system. The business gets more value from their engineers by increasing productivity.
chmod777@reddit
literally a conversation i had last week.
PMs: why is QA getting better lighthouse scores?
me, after looking at the network panel for 5s: because you are loading 6 different trackers on Prod.
IAmTaka_VG@reddit
Only 6? Where can I apply. Marketing has us put duplicate services in.
No one can explain to me why we need Microsoft clarity AND hotjar. Like pick a fucking product.
daerogami@reddit
What's crazy is some of these analytics services let you share data with other analytics dashboards. So you don't have to load several of them! Most of them integrate so well because that's exactly what they're there for, free data.
nyctrainsplant@reddit
I hate analytics as much as the next guy, but you can totally have a fast webpage filled with them. The other fast example site I see a lot is USAtoday (also very quick) and they have plenty of analytics scripts from what I recall.
Kinglink@reddit
Now if only adblock and blocking cookies would get us the speed back.
reddit_man_6969@reddit
I highly doubt an e commerce website would have no analytics
illustrious_trees@reddit
try it? I don't see anything showing up on my browser console, and nor is my adblock detecting anything out of the ordinary.
MaliciousTent@reddit
Some analytics can be done server-side.
oursland@reddit
I bet that is what is going on here. Client-side analytics is useful if you anticipate the server developers are not particularly skilled.
MaliciousTent@reddit
So you are saying ya get what ya pay for ?
IAmTaka_VG@reddit
Once again it looks like in-house analytics. The developers of this site are very old school mentality.
pyabo@reddit
Love it. It's one dude and he's been in the job for 18 years.
devolute@reddit
My pro tip is using an analytics package that is much less creepy than Google Analytics.
reddit_man_6969@reddit
My pro tip is to use something less shitty and render-blocking than Adobe
yawaramin@reddit
Well, you can check for yourself by looking for analytics scripts in the Network tab.
Matt3k@reddit
Check all the calls to trk.aspx. It's on everything and has no impact on performance. Why would it?
UpsetKoalaBear@reddit
They have no tracking/ads because the type of people that use the McMaster Carr website on the daily are not the type of people to make impulse purchasing decisions.
People are forgetting that it’s a business decision. It’s not because they “care about performance” but it’s because they don’t have a need to upsell the people who actually use the site on a daily basis.
The type of people that use the site are specifically going on there to order a specific thing. They’re not buying for the sake of buying and quite often they’ll be purchasing for their job or work.
As a result they use a company card or similar that they’ll be scrutinised for when the company does an audit or similar. So you can’t “upsell” them because they’re not going to try buying anything else other than what they need.
It's a very specific website with a very specific purpose. This extends to most other distributors for stuff like this; Farnell, for example, is quite fast (after the initial load) compared to 99% of commerce sites, and they have tracking/analytics. Another example is metals4U in the UK, who distribute metal stock.
These companies have just realised that their target demographic quite literally doesn’t or can’t buy anything else because they’re on the job and buying through a company or such.
intermediatetransit@reddit
The problem with analytics isn't so much e.g. GTM or whatever script you're loading. It's the mistake of handing the reins to non-technical people in marketing who have no clue about websites or website performance. And they will randomly add whatever shit they want, dragging the website down.
I’ve seen this in countless companies at this point.
bonnydoe@reddit
the big disappointment when you deliver a site with good speed and the marketing department comes in :(
Ohnah-bro@reddit
It’s why good speed is so important in the first place. So people who have no idea what they’re doing can ctrl v ad tags into google tag manager.
kidman01@reddit
This should be much higher. It's actually not that hard to make a website fast. Society has gotten so used to slow websites (due to a number of reasons) that a perfectly normal one can stand out as fast 😅 (I don't want to downplay the effort that went into making McMaster, it's super nice and pleasant to use)
Selentest@reddit
Very true
devmor@reddit
Just server-side templates and no dynamics where they aren't needed will get you 90% there.
smallballsputin@reddit
HTMX gets you really far, and use JS for that last 10%
devmor@reddit
HTMX is a JS library.
KalilPedro@reddit
no shit
devmor@reddit
"A JS library gets you really far, and use JS for that last 10%" is an odd statement.
KalilPedro@reddit
HTMX could be a Java applet in the browser for all you care; you don't interact with the JS guts when writing your own code, you just add markup to your HTML. It's like saying the browser is not platform independent because it has a Windows implementation. Heck, the DOM is in C++ and you don't say you write C++, you write HTML. HTML is the interface for the DOM, and HTML is the interface for HTMX.
devmor@reddit
My original point was that adding needless frontend libraries is exactly the cause of modern websites being so slow.
The majority of the things you would use HTMX for are not necessary features of a website.
KalilPedro@reddit
👍
KalilPedro@reddit
Htmx could be implemented in the browser using C++ and it wouldn't make a difference on how you use it
intermediatetransit@reddit
I do client-side-heavy websites for a living. React, Vue, etc. Even properly optimised they are all garbage, and the DX of supporting them long term is insane. This whole way of building websites needs to go the way of the dodo. And I say that as someone who makes a living doing it.
devmor@reddit
I feel you! I'm currently working on a PWA that would have been done months ago and been faster and more responsive if it wasn't front-end heavy.
But users are apparently allergic to page loads or something.
Raunhofer@reddit
I assume you meant client-side application, not progressive web app? I personally hope PWAs catch on, as that would undermine the Google/Apple app store monopoly.
devmor@reddit
Those are not mutually exclusive terms.
Raunhofer@reddit
Sure, just that being allergic to page loads doesn't really mean anything for PWA.
intermediatetransit@reddit
It’s all just naive assumptions at this point I suspect. We don’t need all of this.
The problem is that there are multiple generations of web developers who are unable to do anything without these types of tools.
kh0n5hu@reddit
Come to the dark side. Use Go. We got cookies here.
daerogami@reddit
I do romanticize the ideal of being able to build static html with minimal css. That being said, I really love working in Angular. I still maintain a site using its predecessor, AngularJS, in an SSR environment.
Raunhofer@reddit
I initially intended to argue against your claim, as modern tech stacks and methods can achieve ridiculous performance with minimal maintenance.
However, as I considered the rapid pace of technological advancement, I realized that comprehending these concepts and libraries has become increasingly challenging. This complexity may have contributed to a situation where seasoned web developers find themselves needing to relearn fundamental skills, potentially leading to a decline in the overall quality of web code.
Not to mention ads, ads make everything wonky and slow.
stlcdr@reddit
Exactly this. By creating application-specific controls on the server and then just pushing out the resultant HTML, you get stupid-fast rendering. It's one reason I use ASP.NET (.NET Framework) and C#, but it's not the only way to do this.
devmor@reddit
This is also one of the reasons Laravel maintains such a hold over the web development ecosystem.
It does offer first class support for Vue these days, but it shines as a Framework for delivering server side templates as fully rendered HTML.
MarvelousWololo@reddit
I’m not familiar with Laravel, but isn’t it the same for similar web frameworks like Rails or Django?
devmor@reddit
Yes, I believe it was heavily inspired by Rails, in fact.
yawaramin@reddit
Can someone summarize the video?
Lonsdale1086@reddit
- Preloading of pages when you hover a link
- All images the same dimensions so the page doesn't shift when the images pop in
- JS lazy loaded
- Using "pushState" to change pages instead of reloading
- Caching
- Everything's server rendered
Paraphrasing the tweet the creator of the video made on his twitter account @wesbos
Essentially when you hover over an item, it fetches the whole pre rendered content for that page, then when you click it it swaps out the content of the page dynamically instead of loading the page fully.
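A hedged sketch of that pattern, not McMaster's actual code (the names `contentCache`, `navigate`, and the `data-prefetch` attribute are illustrative): fetch the pre-rendered HTML on hover, then on click swap it into the page and update the URL with `pushState` instead of doing a full navigation.

```javascript
// Sketch of hover-prefetch + pushState navigation (illustrative names).
const contentCache = new Map();

async function prefetchPage(url) {
  if (contentCache.has(url)) return;
  const res = await fetch(url);              // server returns pre-rendered HTML
  contentCache.set(url, await res.text());
}

// Returns false when the page was swapped in from cache (so the click
// handler can preventDefault), true to fall back to a normal navigation.
function navigate(url) {
  const html = contentCache.get(url);
  if (!html) return true;                    // cache miss: let the browser navigate
  document.querySelector("#content").innerHTML = html;
  history.pushState({}, "", url);            // change the URL without a reload
  return false;
}

// Wire up links when running in a browser (guarded so the sketch also
// loads outside one, e.g. in a test runner).
if (typeof document !== "undefined" && document.querySelectorAll) {
  document.querySelectorAll("a[data-prefetch]").forEach((a) => {
    a.addEventListener("mouseenter", () => prefetchPage(a.href));
    a.addEventListener("click", (e) => {
      if (!navigate(a.href)) e.preventDefault();
    });
  });
}
```

On a cache hit the click costs zero network round trips, which is why navigation feels instant.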
GXNXVS@reddit
so it's just push-state AJAX, something frameworks like Next.js have had for years…
reddit_man_6969@reddit
Next.JS provides the preload link functionality out of the box, right?
AndriyIF@reddit
That doesn't sound like something unique. Many Ruby on Rails sites used (and still use) all the listed things. If I recall correctly, the tech is called Turbolinks.
jcGyo@reddit
To add to this, (not sure if it's mentioned in the video because I can't watch right now, wish people would provide readable content alongside a video) they are able to cache so aggressively because their content is read from often but only written to occasionally. As opposed to a website like reddit where the content is changing all the time.
lRavenl@reddit
I wonder how much of this is applicable without sacrificing (some) maintainability of your frontend codebase?
masterofmisc@reddit
In a nutshell
Basically they really care and have put a lot of thought into performance
davvblack@reddit
and that's just the stuff we can see! Probably some interesting layers behind the scenes too, layers of Varnish or something, and surprisingly I would actually guess no CDN.
Brillegeit@reddit
They're using Akamai and possibly some of their proprietary and expensive magic. Things like Akamai ESI can really speed up certain sites.
MarvelousWololo@reddit
Close! I think it’s mentioned on the video they are using squid.
lRavenl@reddit
How much of this performance optimization is possible without sacrificing (some) maintainability of your frontend codebase? I imagine using jQuery and tailoring the JavaScript for each page can get hairy with a complicated app (instead of using a framework like Vue/Angular/React).
TravisJungroth@reddit
I imagine they're not trimming the JS by hand.
yawaramin@reddit
Thanks. One question: is CSS not loaded before the body when you have it in a <link> tag in the <head>?
jcGyo@reddit
What they mean is the CSS is loaded in the same request to the server as the HTML file, rather than burning 50-100ms on a second request. Normally this would be a disadvantage for loading subsequent pages, but because they use JavaScript with pushState rather than a full page load when you click a link, there would be no advantage to your browser being able to cache the stylesheet separately.
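For illustration, the trade-off is between shipping the styles in the first response versus referencing them externally (a generic sketch, not McMaster's actual markup):

```html
<!-- Inlined: styles arrive in the same response as the HTML,
     so the browser can paint without waiting on a second request. -->
<head>
  <style>
    body { font-family: sans-serif; margin: 0; }
  </style>
</head>

<!-- External: the browser must discover, request, and download
     /styles.css (often an extra 50-100ms) before it can render,
     though the file can then be cached across full page loads. -->
<head>
  <link rel="stylesheet" href="/styles.css">
</head>
```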
endr@reddit
Nice, sounds like all the stuff Sveltekit does
argh523@reddit
Server-side rendering and lots of little old optimization tricks make a fast website
wakojako49@reddit
dey make it rain wid dat cache
stking68@reddit
without watching i would say jQuery and no non sense JS frameworks ...
Cokemax1@reddit
you were half right.
LloydAtkinson@reddit
It's not 2014 anymore bro.
Merry-Lane@reddit
FYI vanilla JS is way faster than jQuery.
dlampach@reddit
I always marveled at the Mcmaster Carr catalog. Somehow this doesn’t surprise me. With inventory like they carry you have to be on top of your game.
wildjokers@reddit
A video is a horrible way to present this info. Is there a written summary somewhere?
Hedanito@reddit
"174ms to render is so fast!"
Cries in gamedev
h4l@reddit (OP)
The website in question: it is impressively snappy! https://mcmaster.com/
ShetlandJames@reddit
Some pages have dogshit speeds. This one sits for about 6 seconds on a throbber for me
topherhead@reddit
Thinking that might be a cache miss. Since a bunch of traffic is being sent to it now, I'm guessing Akamai fetched it, because when I clicked on it, it was instant.
uCodeSherpa@reddit
Instant for me. Clicking through several products was instant. I saw the loader image appear but it was brief. My email takes longer to open on a fresh chain.
DM_Me_Summits_In_UAE@reddit
https://www.mcmaster.com/products/bearings/
Yeah search seems to be their Achilles
LetrixZ@reddit
I see constant loading screens when navigating pages and products
faberkyx@reddit
Don't know really... looks like nothing out of the ordinary. I get some white pages for like 5-6 seconds... no skeleton, no loading state... UI-wise it looks pretty poor.
xvermilion3@reddit
I don't know it feels normal to me. It doesn't feel "impressively" snappy
BigTimeButNotReally@reddit
Either you have a slow internet (see below) or you are in denial, and suffering from jealousy.
xvermilion3@reddit
lol! I have 500 Mbps internet. It's fast but not "impressively snappy"
BigTimeButNotReally@reddit
Cope
FuckOnion@reddit
I have 50 mbps of bandwidth. Sure, these days that's slow but surely an "impressively snappy" website should be snappy with that?
What I'm experiencing is loading indicators on every single click. That's not great UX to me. I'm not jealous. I literally think I could do better.
felipeccastro@reddit
It's probably related to your location, if you're near the server it would be snappy (which is the value proposition of edge deployment products like Fly.io for example)
belkh@reddit
The website prefetches data with a 2s timeout; if your internet is slow it won't prefetch, and it'll just work like any other website. Open the network tab and see if the prefetches are working for you.
When it does, it acts as if you've already prefetched all the products they have, making all navigation quick
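A rough sketch of that behavior (the function names are mine, and `AbortSignal.timeout` is assumed, available in modern browsers and Node 17.3+): attempt the prefetch, but abandon it after 2 seconds, so a slow connection degrades into ordinary navigation instead of stacking up requests.

```javascript
// Sketch: prefetch with a deadline. On slow connections the request is
// aborted and the site behaves like a plain server-rendered website.
const prefetched = new Map();

async function tryPrefetch(url, timeoutMs = 2000) {
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(timeoutMs) });
    prefetched.set(url, await res.text());
    return true;
  } catch {
    return false; // timed out or failed; a later click loads the page normally
  }
}
```

The timeout doesn't make the site fast; it just bounds how much a slow connection can hurt, which matches the "fast for some, slow for others" reports in this thread.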
Agoras_song@reddit
What do you mean by a 2s timeout? Could you point me to somewhere on the page/a tag they use? I am very impressed by their website.
-Hi-Reddit@reddit
The state of the r/programming subreddit these days, people asking what a timeout is...Fml
dwerg85@reddit
Maybe people are here to learn. Don’t be a dunce.
belkh@reddit
Send a request to prefetch data, and time out if it takes longer than 2s; this can be done in frontend JS or server side.
It's not complex. The timeout is not why it's fast; it's why it's not fast for some people.
The website just goes: "oh, you're on the categories page? I'll prefetch the first page of each category," and to not keep those requests hanging, it times out pretty quickly
Agoras_song@reddit
That makes sense. On further inspection, I noticed it prefetches any link that your mouse is hovering on.
Plank_With_A_Nail_In@reddit
That's normal though.
haskell_rules@reddit
Is it "normal" though? Outside of a few companies that actually care about optimization and user experience, my personal experience is "develop an MVP as fast as possible and move to the next project".
belkh@reddit
There's also cases where it's impractical to prefetch because you can't afford 10x the requests per user on your website
belkh@reddit
I'm explaining why it's fast for some and slow for others
random8847@reddit
Same here. Just feels like a regular website without too much JS.
argh523@reddit
You really notice the difference between this and something built with client side javascript frameworks when you regularly use 10 year old hardware
lppedd@reddit
This tells you about the garbage people have come to consider normal nowadays
iamapizza@reddit
I don't even say hello to the world without 800 MB of react.dumpersterfire.js
Plank_With_A_Nail_In@reddit
This website does literally nothing though, it's just lists of stuff.
nelmaven@reddit
It sells products.
potatoespud@reddit
For those interested in their range of products, they are great to deal with and have great delivery lead times on the East Coast of the US. Great for SS nuts and bolts.
Hydraulic_IT_Guy@reddit
Ah, they're masters of reddit advertising
damagednoob@reddit
I don't understand how sprites are an advantage anymore. With HTTP/2 and multiplexing, wouldn't separate images be just as fast to download and render? I would also imagine dropping sprites would significantly decrease the complexity of adding, modifying, and removing images on a page.
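For context, the classic technique in question looks something like this (file and class names are hypothetical): one combined image is downloaded and cached once, and each icon is just a CSS offset into it.

```html
<style>
  .icon { width: 40px; height: 40px;
          background-image: url(/img/parts-sprite.png); }
  .icon-screw  { background-position:    0     0; }
  .icon-bolt   { background-position: -40px    0; }
  .icon-washer { background-position: -80px    0; }
</style>
<span class="icon icon-bolt"></span>
```

Whether the request-count savings still matter under HTTP/2 is exactly the question here; what remains is one cache entry and no per-image request overhead.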
bwainfweeze@reddit
I suspect this depends on whether you’re using a CDN or not.
JPowTheDayTrader@reddit
TIL this company is in LA
learnactreform@reddit
Cool
Twirrim@reddit
If you really want to see impressively snappy, https://lite.cnn.com. That's how fast things could be if we stop overloading it with javascript, hundreds of calls to microservices, etc. etc.
LetrixZ@reddit
That's only text on a white background.
Twirrim@reddit
Look how fast it goes, how quickly they're able to return content to the viewer, present it fully rendered.
blckshdw@reddit
I think that’s the point
masterofmisc@reddit
Wow. It's amazing that they are using the Yahoo YUI library and jQuery. Old tech can still get the job done!
moduspol@reddit
I miss jQuery
soundgravy@reddit
It fulfilled a purpose at a time when it was needed. Today, jQuery is an absolutely dreadful choice with so many better tools available.
FlatTransportation64@reddit
Why? I see this sentiment repeated pretty often but no one seems to go into details about it
starm4nn@reddit
I'd say jQuery had two value propositions:
- Cross-platform code, which is mostly unnecessary now that Safari is the only badly behaving browser you'd want to support. And even then, it's mostly edge cases.
- Genuinely useful APIs. The jQuery function itself (usually invoked via the dollar sign) was great back when document.querySelectorAll didn't exist or wasn't reliable.
I think jQuery had plenty of good ideas, as evidenced by their adoption into vanilla JS.
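The "adopted by vanilla JS" point is concrete: most jQuery one-liners now have direct native equivalents, and the main thing the platform never absorbed is jQuery's chainable operate-on-a-set style. A DOM-free sketch of that wrapper pattern (all names here are mine, for illustration only):

```javascript
// jQuery idioms and their vanilla counterparts:
//   $('.item')           -> document.querySelectorAll('.item')
//   $el.addClass('x')    -> el.classList.add('x')
//   $el.on('click', fn)  -> el.addEventListener('click', fn)
//   $.ajax(...)          -> fetch(...)
// The chainable set-at-once style, modeled on plain objects so it runs
// outside a browser:
function $(items) {
  const api = {
    each(fn) { items.forEach(fn); return api; },
    addClass(name) {
      return api.each((el) => el.classes.add(name));
    },
    attr(key, value) {
      return api.each((el) => { el.attrs[key] = value; });
    },
    get() { return items; },
  };
  return api;
}
```

Every method returns the wrapper, so calls chain the way `$('.item').addClass('active').attr(...)` does in real jQuery.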
popiazaza@reddit
Using pure JS gives better performance because jQuery is a bloated translation layer.
Nowadays we also have lots of bundlers/compilers that can optimize for the best performance at the smallest size.
civildisobedient@reddit
I haven't looked at it lately, but typically there are compatibility checks that use native methods if the browser supports them. That's a pretty fast operation.
But I just don't see why it's even necessary these days, since you don't see the kinds of crazy compatibility issues we had a decade (or more) ago. Maybe if you have to support some ancient government system that still runs Windows 95.
FlatTransportation64@reddit
Pure JS sure, but the user above was talking about tools
popiazaza@reddit
Hmm. Isn't a bundler/compiler a tool? It compiles whatever framework you use down to plain JS.
Even just the IDEs for writing JavaScript (are those tools?) make writing pure JS so much easier that you don't need jQuery anymore.
You could use something like React, which brings lots of functionality but is bloated.
Or Svelte for the best of both worlds, or any of the other choices. Really, there are so many tools out there.
calsosta@reddit
Most of what jQuery can do, can now be done with native JS. Also, it doesn't provide enough to do what modern app frameworks can do. So for me it is either too much or too little. There might be performance or interoperability issues, but I never saw them.
Even so, part of me wishes we lived a steampunk-like alternate reality where jQuery and KnockoutJS became the predominant libraries for building apps.
More-Butterscotch252@reddit
We have htmx now, because... well, I don't know why.
yawaramin@reddit
htmx pushes you towards doing almost everything that the McMaster-Carr website is doing. Server rendering, history push, loading indicator–they even have a prefetch extension. This website is almost like a showcase for htmx.
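The parallel is easy to see in markup. A hypothetical product-list fragment using htmx's documented attributes and its preload extension (the URLs and element structure are made up for illustration, not taken from either site):

```html
<!-- hx-get/hx-target/hx-push-url and the "preload" extension are from the
     htmx docs; everything else here is invented for the sketch. -->
<div hx-ext="preload">
  <!-- Server renders the HTML; htmx swaps it in and pushes history -->
  <a hx-get="/categories/fasteners"
     hx-target="#results"
     hx-push-url="true"
     preload="mouseover">Fasteners</a>
  <!-- Loading indicator shown while the request is in flight -->
  <img id="spinner" class="htmx-indicator" src="/spinner.gif" alt="Loading">
  <div id="results" hx-indicator="#spinner"></div>
</div>
```

Server rendering, history push, a loading indicator, and hover prefetch, all declared as attributes, which is essentially the McMaster-Carr recipe.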
TrevorPace@reddit
Pretty much. For some reason, when dynamic DOM manipulation frameworks came out, it's like we all forgot we could send pre-rendered HTML. htmx is just that, but asserted by attributes on the HTML elements themselves, which generally feels easier than writing JavaScript to do it explicitly.
That's the same reason Tailwind has become so popular. People could always write custom classes/ids, but it's just easier to add a utility class to an element's class list.
stlcdr@reddit
It’s still available, you know. Nothing stopping you from using it except other people’s animosity.
triggeron@reddit
I've used McMaster my whole career. The website is PERFECT, nothing else even comes close.
d33d4y@reddit
Bookmark
Selentest@reddit
It's not "wicked fast", tho
superdirt@reddit
Can you name another e-commerce site with a largest contentful paint under 174 milliseconds at the 75th percentile?
campog@reddit
rockauto.com
Selentest@reddit
Should I? The consensus seems to be that it's blazingly fast in general, not just blazingly fast for an e-commerce site.
superdirt@reddit
If you want to move the goal posts to a site with less complexity, go for it. Name some.
Selentest@reddit
Move the goalposts? You seem confused.
superdirt@reddit
There are engineers who have experienced the challenge of optimizing LCP, then there is you.
seanluke@reddit
Craigslist maybe?
MarvelousWololo@reddit
Another great example. I love Craigslist.
MarvelousWololo@reddit
People claiming it isn’t impressive might be navigating a different web than me
Swimming-Cupcake7041@reddit
Other sites being slow doesn't mean this site is fast.
callmelucky@reddit
It literally does jfc
reddit_man_6969@reddit
Usain Bolt isn’t that fast. 9 seconds is a pretty big number. 100 meters isn’t that far.
Waryle@reddit
Considering that the very notion of being fast is entirely relative, yes, it does exactly mean that.
gold_rush_doom@reddit
It kind of does
agumonkey@reddit
Yeah it's a good ok+ but it's nice to see a large sales platform with a simple and performant structure
jo1long@reddit
Cool presentation, informative.
dtwhitecp@reddit
the fact that McMaster is used almost ubiquitously in the US, and has been for decades, yet everyone still likes and depends on them, tells you they've got some smart people on the payroll, including good management that doesn't pull it in some dumb direction.
wildjokers@reddit
Screw them and their overpriced shipping, though. Sometimes shipping costs as much as or more than the stuff you buy. Although their customers are primarily businesses, who don't normally care about shipping costs; they don't cater to retail buyers. But they can be a good source of otherwise hard-to-source stuff, like G10 (Garolite), which I use as a print surface for my 3D printer.
Darwinmate@reddit
Holy hell that was extremely informative.
So much new info for a newb like me!
mattsmith321@reddit
I worked on a B2B commerce site back in 2001-2003 which targeted auto repair shops ordering parts from their local parts store. First time the owner went out to a test customer he reported back that the site was too slow. Turns out the shop computer was only getting about 5-10k down on their 56k modem due to the ruralness and splices in their lines at the shop. We ended up having to do a lot of cool things like XML data islands and XSLT transforms in IE before AJAX was even a thing. I also had the backend guy pre-render some specific chunks of data so that I could download it directly and then cache it. I had a blazing fast Year-Make-Mileage-Engine selector back in the day. Good times!
deong@reddit
The amount of time I’ve spent in my life refining a YMME selector…
We had to modify the internals of the Qt combo box because we needed keyboard handling to work like an old green screen app. A normal combo box filters dynamically based on character matching. Type “ch” and it’ll narrow to Chevrolet, Chrysler, etc.
We wanted to maintain numeric codes for makes that existed in the old application. Like 1 for Chevrolet, 2 for Ford, etc.
So we hacked in a lookup table so that you could either type “F” and have it filter like usual or you could type 2, and it would select Ford and move focus to the model control.
Probably still the worst code I’ve ever written.
mattsmith321@reddit
Oh yes, I love how thick clients allow you to type several characters to narrow down the selection. I hate how web browsers don’t.
geeeffwhy@reddit
mcmaster has generally been pretty committed to a high quality consumer experience. they had an excellent catalog before websites were the de facto source. that’s the reason behind the reason.
mck1117@reddit
They still have the amazing print catalog, and plenty of people still use it!
geeeffwhy@reddit
they used to have a great catalog. they still do, but they used to, too.
DacMon@reddit
And you can return literally anything. Simply send it back and you get a credit.
Fantastic service.
testfire10@reddit
Everything about McMaster carr is amazing. No surprise to me their website is killing it too.
ComputerWhiz_@reddit
Watched this yesterday. It was a really in-depth and fascinating case study.
MarvelousWololo@reddit
Fr, super entertaining. I could watch such videos for hours.
ComputerWhiz_@reddit
I really liked this one about porting apps to Windows 95 if you are looking for something somewhat similar: https://youtu.be/CTUMNtKQLl8
MarvelousWololo@reddit
Awesome, thanks!!
Raunhofer@reddit
Usually sites that are information-focused, as the Internet used to be, tend to be fast — no matter the stack really.
st4rdr0id@reddit
I was expecting some no-JS minimalistic website, but it is actually a .NET e-commerce site with about 10 JS files (the heaviest is 2.7 MB, though only 500 kB compressed). The browsing feels fast once loaded, though. Images are < 100 kB each.
I wonder if it could be made still faster by switching to fully static pages and enhancing the caching. But it is already fine as it is.
AndrewNeo@reddit
when I saw this come up on my youtube recommended I thought it was going to be about their shipping, because I swear their stuff arrives so fast they had to have shipped it before you ordered
zelphirkaltstahl@reddit
In what world is that fast? ...
I looked at it in the network inspector. The last request finished after 6.78s. OK, I guess that was probably some kind of delayed request, sent after other stuff got loaded. Maybe. But even looking at other timings, lots of stuff (half of the requests, actually) is only sent after the 1 second mark, and of course still takes time to finish. There is still plenty of request handling going on up to roughly 2.5s or even 3s.
... oh wait, I know in what world it actually is fast! In reality, where the web consists more and more of bloated JS crap, websites posing as "apps".
As a comparison I made a basic static website, simply putting it on a VPS I rented. It finishes after 362 ms with no cache (cached: 272 ms, and it could be faster if I told the browser to cache the HTML and CSS), with pictures and CSS all loaded. I have basically zero optimization there: no CDN, no nothing.
uCodeSherpa@reddit
It is fast in an era when Gmail takes 13+ seconds to get you into your email (when the new Gmail first launched it was almost 30 seconds).
For my part, I am just glad to see tech at least somewhat FINALLY beginning to look at perf again, even if “fast” is still “slow”.
ayushmaansingh304@reddit
https://www.serenity-ui.com/
__some__guy@reddit
Absolute garbage.
LloydAtkinson@reddit
Really? advertising spam?
YesIAmRightWing@reddit
the speed defo makes a difference
when traversing websites trying to find parts, i'm usually trying to fix a problem by ordering parts
i dont give a shit about fancy css or the rest of it. i just want my god damn product.
which is why i appreciate the above.
bwainfweeze@reddit
While worrying your boss or the customer will think you’re fucking around on the Internet.
bwainfweeze@reddit
Website designed to be used on 3G networks standing inside your work van. So of course it’s fast af on gigabit.
dirty-sock-coder-64@reddit
noob question, what kind of e-commerce framework do they use for their payment gateway?
they do have payment method options, how did they add that?
As I remember, Stripe can be integrated into websites, and WordPress has integrated multiple payment gateways, including Stripe, but that's all I know.
777777thats7sevens@reddit
It's been a while since I bought from them, but when I did, it seemed like payment was handled asynchronously. You'd put your credit card number in, and the order would go through immediately, no verification. If there was an issue with your card, they'd reach out after the fact for updated payment details, or they'd send you a bill.
Most of their bigger customers aren't paying by credit card anyways -- they have a line of credit with McMaster and settle the bill monthly or however often.
Their whole shtick is making it extremely fast and easy to buy things from them, no friction. Everything is shipped the fastest way possible -- in major manufacturing areas parts can arrive the same day you order. They have a page where you can enter in part numbers and quantities so if you have a BOM you need to buy you can quickly type in what you want rather than navigating around the website to find things one by one. And if you don't know the part number for what you want, you can just write a description of it in the part number field and they'll find that thing and send it to you.
grumpyfan@reddit
Not relevant for this discussion. You can probably find out using a tool to diagnose their frameworks like builtwith.
reddit_man_6969@reddit
Not sure but you can load that stuff asynchronously. Nobody is paying for something in the first second on a page.
dirty-sock-coder-64@reddit
I just wanna know what are they using for adding payment methods
tRfalcore@reddit
Go almost buy something and see for yourself
dirty-sock-coder-64@reddit
i ain't buying nothing, even if i wanted, i'm not from US
WaitForItTheMongols@reddit
That's why they said almost buy. Go through the process until it asks for payment, then look at what you see, and abort the purchase.
DrummerOfFenrir@reddit
When a site I used to use daily shows up, haha!
Source: ex CNC machinist
reddit_user42252@reddit
Youtube could learn from this. Almost unusable on older computers.
biloser69@reddit
2024, webdevs discovering that people don't like loading screens
CowMetrics@reddit
The Factorio website is also silly quick
seekfitness@reddit
Small images, little data over the wire, no extraneous features. It’s not like this is magic, it’s just a simple catalog made into a website.
realstocknear@reddit
I am trying to make my own website fast (open source stock analysis) called https://stocknear.com/
Any tips or feedback what I can improve?
Perfect_Wall_8905@reddit
Went to the website. When browsing the products it was not snappy for sure. Some caching would help here.
idebugthusiexist@reddit
No typescript or web assembly? Not cool enough /s
leogodin217@reddit
Wes Bos is fantastic. One of the few people who make excellent paid courses.
evilbulk@reddit
I can do this today on WordPress. It's not that difficult, but I don't think it's possible to achieve this fast loading for every kind of website. Checked with Page Speed View: their site and my client's site both have a score of 100 on Desktop.
Buzzard@reddit
That gets a score of 47 for mobile and 90 for desktop.
... which means your landing page is slower than one of the bigger e-commerce sites on the internet. That's why it's impressive.