@stancikcom

Hi, you did a nice job with fiole despite its alpha status.
Before I found your project, I thought as well about merging the speed of wheezy with the lean syntax / simplicity / features of itty, flask... ;-)

I have tried some simple benchmarks with a demo "hello world" app.
With "gevent server" patch, the results were rather impressive.

ApacheBench results:
Concurrency Level:      100
Time taken for tests:   2.794 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1470000 bytes
HTML transferred:       120000 bytes
Requests per second:    3578.54 [#/sec] (mean)
Time per request:       27.944 [ms] (mean)
Time per request:       0.279 [ms] (mean, across all concurrent requests)
Transfer rate:          513.72 [Kbytes/sec] received

Setup: app server listening on localhost, ApacheBench running from the same computer (Lenovo U310 notebook with Intel(R) Core(TM) i7-3517U CPU @ 1.90GHz, OS: Ubuntu 14.04 x64, Python 2.7 x64).
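
The "gevent server" setup used for this run can be sketched roughly as follows. The fiole application object is replaced here by a minimal WSGI "hello world" callable, since the exact fiole entry point is not shown in this thread; only the gevent serving pattern is the point.

```python
def app(environ, start_response):
    """Minimal WSGI app standing in for the fiole application."""
    body = b"Hello World!"  # 12 bytes, matching the ab "Document Length"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]


if __name__ == "__main__":
    # gevent's WSGI server: cooperative (greenlet-based) sockets are
    # what give the throughput boost reported above.
    from gevent.pywsgi import WSGIServer
    WSGIServer(("127.0.0.1", 8080), app).serve_forever()
```

Then `ab -c 100 -n 10000 localhost:8080/` against this server reproduces the measurement shape above (absolute numbers depend on the machine).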

With console logging suppressed, it reached 8k requests per second:

ms@ms-Lenovo-U310:~$ ab -c 1000 -n 10000 localhost:8080/ 
Server Software:        Bjoern
Server Hostname:        localhost
Server Port:            8080

Document Path:          /
Document Length:        12 bytes

Concurrency Level:      1000
Time taken for tests:   1.218 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1100000 bytes
HTML transferred:       120000 bytes
Requests per second:    8209.94 [#/sec] (mean)
Time per request:       121.804 [ms] (mean)
Time per request:       0.122 [ms] (mean, across all concurrent requests)
Transfer rate:          881.93 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    3   8.8      0      38
Processing:     9   20  17.5     15     423
Waiting:        9   19  16.3     15     423
Total:          9   23  24.6     15     424

Percentage of the requests served within a certain time (ms)
  50%     15
  66%     15
  75%     15
  80%     15
  90%     67
  95%     89
  98%    103
  99%    112
 100%    424 (longest request)
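
The bjoern run above ("Server Software: Bjoern") can be reproduced with bjoern's two-call API. As before, a minimal WSGI callable stands in for the fiole application, since the app code is not shown in the thread.

```python
def app(environ, start_response):
    """Minimal WSGI app standing in for the fiole application."""
    body = b"Hello World!"  # 12 bytes, as in the ab output above
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]


if __name__ == "__main__":
    import bjoern
    # bjoern is a small WSGI server written in C; ab was pointed
    # at port 8080 in the benchmark above.
    bjoern.run(app, "127.0.0.1", 8080)
```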
@florentx
Owner

florentx commented Feb 1, 2015

Martin, thank you for your benchmark.

Is there any benefit to adding adapters for each third-party server in Fiole? I see that Bottle and some other projects do this, but I don't see why we need it at all: it's quite simple to add this in user code, for example.

I added a short note about it in the FAQ: http://fiole.readthedocs.org/en/latest/developer.html#how-much-is-it-extensible

Only the adapter for the wsgiref server is provided. You can write your own adapter for
your preferred WSGI server. There are examples available in Bottle or itty.py source code for example.

Maybe it makes sense to extend this with a section in the documentation containing example snippet(s) for Björn or Gevent, and to mention the performance benefit?
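
Such a documentation snippet could follow the Bottle-style "server adapter" pattern mentioned in the FAQ: a small dispatcher in user code that picks a backend by name, with no changes to Fiole itself. This is a sketch only (the function name `run_server` and its signature are illustrative, not Fiole API); the application is any WSGI callable.

```python
def run_server(app, host="127.0.0.1", port=8080, backend="wsgiref"):
    """Serve a WSGI app with the named backend (user-code sketch,
    not part of the Fiole API)."""
    if backend == "wsgiref":
        # stdlib reference server, slowest but always available
        from wsgiref.simple_server import make_server
        make_server(host, port, app).serve_forever()
    elif backend == "gevent":
        # greenlet-based server
        from gevent.pywsgi import WSGIServer
        WSGIServer((host, port), app).serve_forever()
    elif backend == "bjoern":
        # fast C-based server
        import bjoern
        bjoern.run(app, host, port)
    else:
        raise ValueError("unknown backend: %s" % backend)
```

The imports are deferred inside each branch, so only the selected server package needs to be installed.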

@stancikcom
Author

I think it is a convenience issue. I believe that for a newbie (or a pragmatic programmer with a deadline) it would be more convenient to have basic adapters included, not only for WSGI servers but also for other common functionality that is missing, e.g. Beaker sessions.

I believe that a short learning curve and flexibility are among the reasons why people opt for a micro-framework... Nevertheless, I understand and fully respect your point about keeping the core code lean.

So, what can be done is as follows:

  1. enhance the documentation with practical deployment tutorials and examples
  2. add unit tests to catch possible performance / compatibility issues with various plugins
  3. extend functionality through separate GitHub projects, e.g. fiole-addon-{functionality}

Let me know which of these to focus on.
