Built a Nepali calendar computation engine in Python, turns out there's no formula for it
Posted by Natural-Sympathy-195@reddit | Python | View on Reddit | 19 comments
What My Project Does
Project Parva is a REST API that computes Bikram Sambat (Nepal's official calendar) dates, festival schedules, panchanga (lunar almanac), muhurta (auspicious time windows), and Vedic birth charts. It derives everything from real planetary positions using pyswisseph rather than serving hardcoded lookup tables. Takes actual lat/lon coordinates so calculations are accurate for any location, not just Kathmandu.
Target Audience
Developers building apps that need Nepali calendar data programmatically. Could be production use for something like a scheduling app, a diaspora-focused product, or an AI agent that needs grounded Nepali date data. The API is public beta so the contract is stable but not yet v1. There's also a Python SDK if you want to skip the HTTP boilerplate.
Comparison
Most existing options are either NPM packages with hardcoded month-length arrays that break outside a fixed year range (usually 2000-2090 BS), or static JSON files someone manually typed from government PDFs. Both approaches fail for future dates and neither accounts for geographic location in sunrise-dependent calculations. Hamro Patro is the dominant consumer app but has no public API, so developers end up writing scrapers that break constantly. Parva computes everything from Swiss Ephemeris, which means it works for any year and any coordinates.
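To make the "compute, don't tabulate" idea concrete: a Bikram Sambat solar month begins at a sankranti, the instant the Sun's ecliptic longitude crosses a multiple of 30°. This is a minimal illustrative sketch, not Parva's actual code: it uses a low-precision mean-Sun polynomial (the real engine uses apparent positions from Swiss Ephemeris) and bisects for the crossing.

```python
import math

def mean_sun_longitude(jd):
    """Mean ecliptic longitude of the Sun in degrees (low-precision
    polynomial in Julian centuries from J2000; Meeus-style)."""
    t = (jd - 2451545.0) / 36525.0
    return (280.46646 + 36000.76983 * t) % 360.0

def find_crossing(target_deg, jd_lo, jd_hi, tol=1e-6):
    """Bisect for the instant the Sun's longitude reaches target_deg.
    Assumes exactly one crossing lies inside [jd_lo, jd_hi]."""
    def offset(jd):
        # signed distance from the target, wrapped into (-180, 180]
        return (mean_sun_longitude(jd) - target_deg + 180.0) % 360.0 - 180.0
    lo, hi = jd_lo, jd_hi
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if offset(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Example: find when the mean Sun reaches 30 deg after J2000
jd_sankranti = find_crossing(30.0, 2451545.0 + 100, 2451545.0 + 120)
```

Because the mean Sun ignores the equation of center, this toy version can be off by a day or two at a month boundary, which is exactly why the real engine queries the ephemeris for the apparent longitude instead.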
2ndBrainAI@reddit
This is fascinating work! Using Swiss Ephemeris to compute calendar dates from actual planetary positions instead of relying on brittle hardcoded tables is such a cleaner approach. I love that it handles geographic coordinates too — sunrise calculations really do vary significantly by location.
The comparison to existing NPM packages with fixed year ranges (2000-2090 BS) really highlights why this was needed. Those hardcoded arrays are always a maintenance nightmare.
Have you run into any interesting edge cases with the panchanga calculations? I'd imagine certain lunar phases might produce some tricky ambiguities depending on the observer's exact coordinates.
Winter-Flan7548@reddit
That's fascinating... would be glad to help if you need some
Winter-Flan7548@reddit
also, using my project removes the hassle of the AGPL license if you truly wanted to open source it
Natural-Sympathy-195@reddit (OP)
Checked it out properly, and the reduction pipeline is genuinely impressive. A pure-Python stack built around DE441 plus explicit IAU 2000A/2006 reductions is, architecturally, a much more auditable approach than treating Swiss Ephemeris as a black box. For my use case, the real constraint is deployment economics more than mathematical taste. A multi-GB kernel footprint is a hard sell for a public API running on low-cost/free-tier infrastructure, whereas pyswisseph gives me a much lighter operational profile for the calendar range I actually need.
So yeah, the MIT route is definitely attractive, but I’d have to solve the infra tradeoff before it becomes a realistic foundation for Parva.
Still, this is absolutely the kind of project I want on my radar, and I can see it being very useful as a validation/reference engine even before it’s a direct backend candidate. If you push further into Vedic calendar systems, I’d be especially interested. Do you have BS sankranti computation on the roadmap?
Winter-Flan7548@reddit
Yeah, it will actually run off of any kernel. I pushed DE441 because of the date range it supports, but it can definitely use DE440, or even the older ones really. I need to correct that in the documents and make sure it is kernel-agnostic. And yes, calendar systems are actually my next real push, as I understand that being able to speak astrology in different calendar systems is important. Thank you for looking at it, and I appreciate the input.
Natural-Sympathy-195@reddit (OP)
Makes sense. I’ll keep an eye on it as you push further into calendar systems.
lewd_peaches@reddit
That's a cool project! I ran into a similar situation building a custom loss function for a niche ML problem. Thought there'd be some elegant closed-form solution, but ended up needing to approximate with a lookup table and a ton of interpolation.
Did you try any optimization techniques after the initial implementation? For instance, could you precompute and cache sections of the calendar, or parallelize the calculations if you're dealing with large batches of dates?
I sometimes use OpenClaw for that kind of thing, basically turning an embarrassingly parallelizable task into a distributed compute job. For example, I once used it to generate a large synthetic dataset (image augmentation, running various filters) - it took a few hours on a single machine, but dropping it onto a cluster of 8 GPUs with OpenClaw cut it down to about 30 minutes. The cost was negligible, maybe a dollar or two worth of GPU time. Might be overkill for your calendar, but something to consider if performance becomes critical.
Natural-Sympathy-195@reddit (OP)
the ML loss function analogy actually maps pretty well, same situation where you're hoping for a clean closed-form and end up humbled by something that's been empirically refined over millennia
on optimization, the interesting thing is the performance profile is probably the opposite of what you'd expect. a single ephemeris call for planetary position is microseconds. computing an entire year of festival dates is maybe 50-100ms total on a single thread, which is already fast enough that caching is the main lever worth pulling, not parallelism. i do precompute festival dates on first request per year and cache them, so repeat calls are essentially free.
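The "compute once per year, then cache" pattern described above can be sketched with `functools.lru_cache`. The `compute_festivals` helper here is a hypothetical stand-in for the expensive ephemeris-backed computation, not Parva's real function:

```python
from functools import lru_cache

def _compute_festivals(year):
    """Hypothetical stand-in for the expensive computation that would
    normally take ~50-100 ms of ephemeris calls per year."""
    return {"dashain": (year, 6, 15), "tihar": (year, 7, 3)}

@lru_cache(maxsize=256)  # one entry per BS year; repeat calls are cache hits
def festivals_for_year(year):
    return _compute_festivals(year)
```

With a bounded `maxsize`, memory stays flat even if clients probe many years, and `festivals_for_year.cache_info()` gives hit/miss counts for free.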
the batch case is real though. if someone hits `/calendar/range?start=2080&end=2200` you want multiprocessing there, and python's embarrassingly parallel story is fine for that since each date is fully independent. standard ProcessPoolExecutor handles it cleanly.
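Since each year is fully independent, the batch path is a textbook `ProcessPoolExecutor` fan-out. A minimal sketch, with `year_calendar` as a placeholder for the real per-year computation:

```python
from concurrent.futures import ProcessPoolExecutor

def year_calendar(year):
    """Placeholder for computing one year's calendar; each year is
    independent, so the batch is embarrassingly parallel."""
    return (year, 365 + (year % 4 == 0))  # dummy day count, not real BS logic

def calendar_range(start, end, workers=4):
    """Fan a range of years out across worker processes."""
    years = range(start, end + 1)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(year_calendar, years))

if __name__ == "__main__":
    result = calendar_range(2080, 2090)
    print(result[2080])
```

The `__main__` guard matters on platforms using the spawn start method, where worker processes re-import the entry module.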
the GPU clustering angle is interesting for your image augmentation case but would be fighting the wrong bottleneck here. the nutation series (1365 lunisolar terms) is dense polynomial evaluation that maps well onto SSE/AVX on a single CPU core, not GPU parallelism. numpy already vectorizes most of it. the actual constraint for a calendar API is network I/O and cold start latency, not compute. throwing a GPU cluster at it would be like renting a cargo ship to deliver a letter.
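For a feel of why this is CPU-friendly dense math: every term in the nutation series has the same shape, a polynomial argument fed into a sine, scaled, and summed. This sketch keeps only the single dominant 18.6-year term of the classic IAU nutation-in-longitude series (coefficients from standard low-precision formulae, e.g. Meeus); the full IAU 2000A series is 1365 terms of exactly this form:

```python
import math

def delta_psi_leading(jd_tt):
    """Nutation in longitude, in arcseconds, truncated to the dominant
    term. t is Julian centuries (TT) from J2000."""
    t = (jd_tt - 2451545.0) / 36525.0
    # Mean longitude of the Moon's ascending node, degrees
    omega = math.radians(125.04452 - 1934.136261 * t)
    # Leading IAU coefficient: about -17.20 arcsec on sin(omega)
    return (-17.1996 - 0.01742 * t) * math.sin(omega)
```

Summing 1365 of these per timestep is exactly the kind of loop that SIMD units and numpy vectorization chew through, with no GPU transfer overhead to amortize.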
what was the niche ML problem if you don't mind sharing? curious what the loss function was approximating
InebriatedPhysicist@reddit
How on earth do you know that the JSONs are manually typed from PDFs?
dethb0y@reddit
hats off to you, i hate working with anything involving dates, let alone something this complicated!
FibonacciSpiralOut@reddit
standard datetime math is already enough to make most programmers cry, so calculating actual planetary physics just to get a date is an absolutely massive flex
End0rphinJunkie@reddit
Standard timezone drift across k8s clusters is already enough of a headache. I can't even imagine debugging a failed cron job because Jupiter was in the wrong position.
Alarmed-Subject-7243@reddit
working with time and calendars is seriously the ultimate boss fight of programming. plugging astronomical data straight into an agentic toolbelt is a brilliant way to stop LLMs from hallucinating dates and just let the math do the heavy lifting.
Natural-Sympathy-195@reddit (OP)
honestly same, i did not sign up to learn this much about lunar angular distances when i started. dates are miserable enough in regular programming, adding "but which calendar system and also where is the sun right now" makes it a special kind of painful.
FrickinLazerBeams@reddit
I didn't know there was a Nepali calendar and I don't need this, but it seems like a really cool piece of work, and solves the problem in the way I'd like, if I were looking for such a thing.
And it's nice to see something that's real programming, not just another vibe coded AI slop project.
skool_101@reddit
great work bro. crosspost at r/technepal as well
cinyar@reddit
...and I thought dealing with timezones was annoying.
Last_Emu_1376@reddit
That’s programming in a nutshell: you dive in excited, then realize all the easy solutions don’t exist. Respect for figuring it out anyway!