Quoting Tristan Cacqueray (2019-02-12 00:56:03)
On Mon, Feb 11, 2019 at 22:52 Clint Byrum wrote:
Quoting Clint Byrum (2019-02-11 15:02:35)
Quoting David Shrewsbury (2019-02-11 10:58:05)
On Mon, Feb 11, 2019 at 1:21 PM Monty Taylor <mordred@inaugust.com> wrote:
So ... what do people think? Should the Zuul project adopt a small C++ service?
I think it's fine that it's in C++. If it really becomes burdensome for some unforeseen reason, it's also simple enough to rewrite in something else (but not Java.... NEVER Java).
You'll get more enthusiasm from me if it's in Rust, but C++ is fine, meh, w'ever.
BTW, I put my code where my mouth is:
https://github.com/SpamapS/upgraded-octo-giggle/blob/master/src/main.rs
I had to alter it a bit to support the white-labeled Zuul API, but otherwise I think it works the same as the C++ version... and yay, a memory-safe language with C++-like performance... in theory.
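(For illustration only: a white-labeled Zuul serves a single tenant, so the tenant segment is normally absent from the API path. The sketch below shows the two URL shapes such a helper might have to produce; the function and parameter names are assumptions for this example, not taken from the linked Rust code.)

```python
# Rough sketch of the two API URL shapes, assuming the usual Zuul routes:
#   multi-tenant:  <api>/api/tenant/<tenant>/build/<uuid>
#   white-labeled: <api>/api/build/<uuid>   (tenant implied by the base URL)
def build_url(api, tenant, build_uuid, whitelabeled=False):
    if whitelabeled:
        return "%s/api/build/%s" % (api, build_uuid)
    return "%s/api/tenant/%s/build/%s" % (api, tenant, build_uuid)

print(build_url("https://zuul.example.com", "mytenant", "0123456789abcdef"))
print(build_url("https://zuul.example.com", "mytenant", "0123456789abcdef",
                whitelabeled=True))
```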
I was wondering if you could provide a reference test and some docs for how these should work? I've fed it URLs like you specified, and they seem to work with builds, but I'm not sure how the error handling is supposed to work, etc.
What would the performance issue be if this rewrite map were written in Python? For benchmarking purposes, please find my implementation below. -Tristan
```python
from functools import lru_cache
import requests

@lru_cache(maxsize=20)
def get(r):
    api, vhost = r.split()
    artifact, build, tenant, _ = vhost.split('.', 3)
    return requests.get("%s/api/tenant/%s/build/%s" % (
        api, tenant, build)).json()["log_url"]

while True:
    try:
        print(get(input()))
    except EOFError:
        # Exit cleanly when stdin is exhausted (e.g. when fed from a file).
        break
    except Exception:
        print("error")
```
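To make the input format explicit, here is a small sketch of how a line appears to be parsed by the code above; the host name layout (artifact.build.tenant.rest) is inferred from the split() calls, and the sample values are made up:

```python
# Assumed input line format (inferred from the implementation above):
#   "<zuul-api-base-url> <vhost>"
# where the vhost starts with "<artifact>.<build>.<tenant>." segments.
sample = ("https://zuul.example.com "
          "site.0123456789abcdef.mytenant.preview.example.com")

api, vhost = sample.split()
artifact, build, tenant, _ = vhost.split('.', 3)
print(api, artifact, build, tenant)
# https://zuul.example.com site 0123456789abcdef mytenant
```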
This code is approximately 10x slower in a single thread than the Rust or the C++, which are virtually the same as each other (actually the Rust gets better the more LRU expirations it does). I added a requests Session object and got it to more like 7x slower, but the Python is still much slower.

Very scientific tests on my laptop (you can find them in the linked GitHub fork under tests). These are all typical of repeated runs:

Python:
$ /usr/bin/time python3 ../octo.py < api/manifest.txt > /dev/null
11.88user 0.41system 0:13.39elapsed 91%CPU (0avgtext+0avgdata 28592maxresident)k
0inputs+0outputs (0major+12340minor)pagefaults 0swaps

C++:
$ /usr/bin/time ../zuul-preview/zuul-preview < api/manifest.txt > /dev/null
1.33user 0.94system 0:01.96elapsed 115%CPU (0avgtext+0avgdata 9880maxresident)k
0inputs+0outputs (0major+895minor)pagefaults 0swaps

Rust:
$ /usr/bin/time ../target/release/upgraded-octo-giggle < api/manifest.txt > /dev/null
0.69user 0.32system 0:01.35elapsed 75%CPU (0avgtext+0avgdata 8516maxresident)k
0inputs+0outputs (0major+618minor)pagefaults 0swaps

This is doing 10,000 unique simulated builds with a local nginx server serving the build responses statically. The more builds, the more Rust wins, suggesting that the LRU expiration in C++ has a performance problem.

Anyway, is this code really performance-sensitive enough to warrant a new language?
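For reference, the requests Session change mentioned above probably looks roughly like this; a sketch, not the exact code behind the 7x number:

```python
from functools import lru_cache
import requests

# A single Session reuses the underlying TCP connection between lookups
# instead of opening a new connection for every request, which avoids some
# of the per-call overhead in the original version.
session = requests.Session()

@lru_cache(maxsize=20)
def get(r):
    api, vhost = r.split()
    artifact, build, tenant, _ = vhost.split('.', 3)
    return session.get("%s/api/tenant/%s/build/%s" % (
        api, tenant, build)).json()["log_url"]
```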