SwitchOnTheNiteLite@reddit
IBM used to make hardware specifically for applying XSLTs to XML files, used by financial institutions and shit to process transactions and stuff like that. The good ol' days.
Jolly_Resolution_222@reddit
What kind of hardware?
Infiniteh@reddit
I had to get a certification for the appliance about 113 years ago. Never used it. IIRC they are or were called IBM DataPower appliances. Basically machines that were designed to be fast at hardware-level data validation, transformation, ...
One use case I experienced was a physical gateway machine for a bunch of on-prem 'ingest' type services. The gateway checked authn/z and validated SOAP payloads before passing them on to another machine that would route to the appropriate service.
Worth_Trust_3825@reddit
There are xslt tools that do not require browser. As a result, you can prerender your report.
13utters@reddit (OP)
Yes I prefer/suggest the use of xsltproc
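For anyone who hasn't used it, a minimal prerender step looks something like this. The file names and content are invented for illustration, and this assumes xsltproc (from libxslt) is on the box:

```shell
# Invented sample input: a trivial XML document to render.
cat > data.xml <<'EOF'
<items><item>alpha</item><item>beta</item></items>
EOF

# Invented stylesheet: render each item as an HTML list entry.
cat > report.xsl <<'EOF'
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html" indent="yes"/>
  <xsl:template match="/items">
    <ul>
      <xsl:for-each select="item">
        <li><xsl:value-of select="."/></li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>
EOF

# -o writes the prerendered report to disk instead of stdout,
# so no browser-side XSLT is needed to view it.
xsltproc -o report.html report.xsl data.xml
```

Since xsltproc ships with libxslt, this tends to work on a stock Linux box with no extra runtime.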
elmuerte@reddit
Restricting yourself to XSLT 1.0 and XPath 1.0, while XSLT 3.0 is already 8 years old.
13utters@reddit (OP)
The constraint here is deployment, not expressiveness. I wanted something that runs on a stock jump box without introducing a new runtime or packaging a processor like Saxon.
XSLT 3.0 would definitely simplify parts of the grouping/aggregation logic, but at the cost of portability. In this case I kept the transform minimal and pushed the more dynamic logic into the browser instead.
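For context on the grouping point: XSLT 1.0 has no xsl:for-each-group, so grouping means the Muenchian key trick. A rough sketch of what that looks like under 1.0, runnable with xsltproc (element names and data are invented):

```shell
# Invented sample input: orders to group and sum by region.
cat > orders.xml <<'EOF'
<orders>
  <order region="east" total="10"/>
  <order region="west" total="5"/>
  <order region="east" total="7"/>
</orders>
EOF

cat > group.xsl <<'EOF'
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <!-- Muenchian grouping: index every order by its region -->
  <xsl:key name="by-region" match="order" use="@region"/>
  <xsl:template match="/orders">
    <!-- Visit only the first order of each region group: an order is
         "first" if its generate-id matches the first key() hit. -->
    <xsl:for-each select="order[generate-id() = generate-id(key('by-region', @region)[1])]">
      <xsl:value-of select="@region"/>
      <xsl:text>: </xsl:text>
      <!-- Aggregate over the whole group via the same key -->
      <xsl:value-of select="sum(key('by-region', @region)/@total)"/>
      <xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
EOF

xsltproc group.xsl orders.xml
# prints:
# east: 17
# west: 5
```

In 2.0/3.0 the same thing is a single xsl:for-each-group with group-by="@region", which is where the "simplify the grouping/aggregation logic" trade-off comes from.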
Slsyyy@reddit
> That makes the pipeline simple
Until you need to do something non-trivial, and then everything falls apart. I don't like solutions that speed up the 90% of cases that are easy anyway. I like general-purpose languages, because they make those 10% of hard problems manageable.
In my previous project we had some XSLT enriched with some Java code, because of course XSLT was not enough. It was just horrible.
In the past XSLT had one huge advantage: it could be used from any programming language, which is a plus. Nowadays we have WebAssembly, widely supported everywhere, which is more general and powerful.
13utters@reddit (OP)
TIL you can run XSLT 2.0 in WASM (via Saxon)
TypeComplex2837@reddit
Shit will never die.. 25 years in I've still got a well-paid backlog several years long with this crap 🤣