Version 0.2.8

master
Rob Glew 9 years ago
parent f4274e1e82
commit 9a58a915c2
  1. MANIFEST.in (4)
  2. README.md (199)
  3. docs/source/conf.py (5)
  4. pappyproxy/Makefile (6)
  5. pappyproxy/__init__.py (1)
  6. pappyproxy/comm.py (25)
  7. pappyproxy/config.py (347)
  8. pappyproxy/console.py (252)
  9. pappyproxy/context.py (219)
  10. pappyproxy/http.py (155)
  11. pappyproxy/iter.py (4)
  12. pappyproxy/macros.py (6)
  13. pappyproxy/pappy.py (254)
  14. pappyproxy/plugin.py (25)
  15. pappyproxy/plugins/debug.py (6)
  16. pappyproxy/plugins/filter.py (3)
  17. pappyproxy/plugins/macrocmds.py (3)
  18. pappyproxy/plugins/manglecmds.py (8)
  19. pappyproxy/plugins/misc.py (13)
  20. pappyproxy/plugins/tagcmds.py (3)
  21. pappyproxy/plugins/view.py (11)
  22. pappyproxy/proxy.py (172)
  23. pappyproxy/requestcache.py (1)
  24. pappyproxy/site.py (179)
  25. pappyproxy/site/base.html (11)
  26. pappyproxy/site/certs.html (6)
  27. pappyproxy/site/index.html (8)
  28. pappyproxy/site/norsp.html (8)
  29. pappyproxy/site/static/test.html (1)
  30. pappyproxy/site/viewrsp.html (6)
  31. pappyproxy/tests/test_comm.py (112)
  32. pappyproxy/tests/test_http.py (43)
  33. pappyproxy/tests/test_proxy.py (318)
  34. pappyproxy/tests/testutil.py (20)
  35. pappyproxy/util.py (269)
  36. setup.py (3)

@@ -5,4 +5,6 @@ recursive-include pappyproxy *.py
recursive-include pappyproxy *.vim
recursive-include pappyproxy *.txt
recursive-include pappyproxy *.template
include docs/source/overview.rst
recursive-include pappyproxy *.template
recursive-include pappyproxy/site *
include docs/source/overview.rst

@@ -2,6 +2,67 @@ The Pappy Proxy
===============
[Documentation](https://roglew.github.io/pappy-proxy/) - [Tutorial](https://roglew.github.io/pappy-proxy/tutorial.html)
Table of Contents
=================
* [Overview](#overview)
* [Introduction](#introduction)
* [Contributing](#contributing)
* [I still like Burp, but Pappy looks interesting, can I use both?](#i-still-like-burp-but-pappy-looks-interesting-can-i-use-both)
* [How to Use It](#how-to-use-it)
* [Installation](#installation)
* [Quickstart](#quickstart)
* [Lite Mode](#lite-mode)
* [Adding The CA Cert to Your Browser](#adding-the-ca-cert-to-your-browser)
* [Firefox](#firefox)
* [Chrome](#chrome)
* [Safari](#safari)
* [Internet Explorer](#internet-explorer)
* [Configuration](#configuration)
* [General Console Techniques](#general-console-techniques)
* [Run a shell command](#run-a-shell-command)
* [Running Python Code](#running-python-code)
* [Redirect Output To File](#redirect-output-to-file)
* [Generating Pappy's CA Cert](#generating-pappys-ca-cert)
* [Browsing Recorded Requests/Responses](#browsing-recorded-requestsresponses)
* [Tags](#tags)
* [Request IDs](#request-ids)
* [Passing Multiple Request IDs to a Command](#passing-multiple-request-ids-to-a-command)
* [Context](#context)
* [Filter Strings](#filter-strings)
* [List of fields](#list-of-fields)
* [List of comparers](#list-of-comparers)
* [Special form filters](#special-form-filters)
* [Scope](#scope)
* [Built-In Filters](#built-in-filters)
* [Decoding Strings](#decoding-strings)
* [Interceptor](#interceptor)
* [Repeater](#repeater)
* [Macros](#macros)
* [Generating Macros From Requests](#generating-macros-from-requests)
* [Request Objects](#request-objects)
* [Useful Functions](#useful-functions)
* [Intercepting Macros](#intercepting-macros)
* [Enabling/Disabling Intercepting Macros](#enablingdisabling-intercepting-macros)
* [Logging](#logging)
* [Additional Commands and Features](#additional-commands-and-features)
* [Response streaming](#response-streaming)
* [Viewing Responses In Browser](#viewing-responses-in-browser)
* [Plugins](#plugins)
* [Should I Write a Plugin or a Macro?](#should-i-write-a-plugin-or-a-macro)
* [Global Settings](#global-settings)
* [Using an HTTP Proxy](#using-an-http-proxy)
* [Using a SOCKS Proxy](#using-a-socks-proxy)
* [Transparent Host Redirection](#transparent-host-redirection)
* [FAQ](#faq)
* [Why does my request have an id of --?!?!](#why-does-my-request-have-an-id-of---)
* [Boring, Technical Stuff](#boring-technical-stuff)
* [Request Cache / Memory usage](#request-cache--memory-usage)
* [Changelog](#changelog)
Overview
========
Introduction
------------
The Pappy (**P**roxy **A**ttack **P**roxy **P**rox**Y**) Proxy is an intercepting proxy for performing web application security testing. Its features are often similar to, or straight-up rip-offs of, those in [Burp Suite](https://portswigger.net/burp/). However, Burp Suite is neither open source nor a command line tool, making a proxy like Pappy inevitable. The project is still in its early stages, so there are bugs and only the bare minimum features, but it can already do some cool stuff.
@@ -16,6 +77,18 @@ Another option is to try writing a plugin. It might be a bit easier than contrib
You can find ideas for features to add on [the contributing page in the docs](https://roglew.github.io/pappy-proxy/contributing.html).
I still like Burp, but Pappy looks interesting, can I use both?
---------------------------------------------------------------
Yes! If you don't want to go completely over to Pappy yet, you can configure Burp to use Pappy as an upstream proxy server. That way, traffic will go through both Burp and Pappy and you can use whichever you want to do your testing.
How to have Burp forward traffic through Pappy:
1. Open Burp
2. Go to `Options -> Connections -> Upstream Proxy Servers`
3. Click `Add`
4. Leave `Destination Host` blank, but put `127.0.0.1` in `Proxy Host` and `8000` into `Port` (assuming you're using the default listener)
5. Configure your browser to use Burp as a proxy
How to Use It
=============
@@ -46,6 +119,21 @@ $
And that's it! The proxy will by default be running on port 8000 and bound to localhost (to keep the hackers out). You can modify the port/interface in `config.json`. You can list all your intercepted requests with `ls`, view a full request with `vfq <reqid>` or view a full response with `vfs <reqid>`. Right now, the only command to delete requests is `filter_prune` which deletes all the requests that aren't in the current context (look at the sections on the context/filter strings for more information on that).
Here's everything you need to know to get the basics done:
* This quickstart assumes you've used Burp Suite
* Make a directory for your project and `cd` into it in the terminal. Type `pappy` into the terminal and hit enter
* Commands are entered into the prompt that appears
* The proxy starts listening on port 8000 once the program starts
* Use `ls` to look at recent requests, `ls a` to look at the entire history
* You will use the number in the `id` column to perform actions on that request
* Use `vfq <id>` and `vfs <id>` to view full requests/responses
* Use `ic` to modify requests with a text editor as they go through the proxy or `ic req rsp` to modify both requests and responses
* Use `rp <id>` to send a request to the repeater. In the repeater, use `<leader>f` to send the current buffer (you may need to configure a leader key in vim). Use `:qa!` to quit the repeater.
If you want to do more, I highly suggest reading the whole readme!
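As a sketch, a first session following those steps might look like this (output abridged, request IDs will vary):

```
$ mkdir testproject && cd testproject
$ pappy
Proxy is listening on port 8000
pappy> ls
...
pappy> vfq 1
...
pappy> vfs 1
...
```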
Lite Mode
---------
If you don't want to dirty up a directory, you can run Pappy in "lite" mode. Pappy will use the default configuration settings and will create a temporary data file in `/tmp` to use. When you quit, the file will be deleted. If you want to run Pappy in lite mode, run Pappy with either `-l` or `--lite`.
@@ -135,8 +223,8 @@ Type "help", "copyright", "credits" or "license" for more information.
Non-python commands can be issued with ``cmd("your command")``.
Run python code from external files with ``run("filename.py")``
>>> from pappyproxy import config
>>> config.CONFIG_DICT
>>> from pappyproxy import pappy
>>> pappy.session.config.config_dict
{u'data_file': u'./data.db', u'history_size': 1000, u'cert_dir': u'{DATADIR}/certs', u'proxy_listeners': [{u'interface': u'127.0.0.1', u'port': 8000}]}
>>> exit()
pappy>
@@ -218,9 +306,8 @@ The following commands can be used to view requests and responses
| Command | Aliases | Description |
|:--------|:--------|:------------|
| `ls [a|<num>`]| list, ls |List requests that are in the current context (see Context section). Has information like the host, target path, and status code. With no arguments, it will print the 25 most recent requests in the current context. If you pass 'a' or 'all' as an argument, it will print all the requests in the current context. If you pass a number "n" as an argument, it will print the n most recent requests in the current context. |
| `sm` [p] | sm, site_map | Print a tree showing the site map. It will display all requests in the current context that did not have a 404 response. This has to go through all of the requests in the current context so it may be slow. If the `p` option is given, it will print the paths as paths rather than as a tree. |
| `viq <id(s)>` | view_request_info, viq | View additional information about requests. Includes the target port, if SSL was used, applied tags, and other information. |
| `ls [a|<num>]`| list, ls |List requests that are in the current context (see Context section). Has information like the host, target path, and status code. With no arguments, it will print the 25 most recent requests in the current context. If you pass 'a' or 'all' as an argument, it will print all the requests in the current context. If you pass a number "n" as an argument, it will print the n most recent requests in the current context. |
| `sm [p]` | sm, site_map | Print a tree showing the site map. It will display all requests in the current context that did not have a 404 response. This has to go through all of the requests in the current context so it may be slow. If the `p` option is given, it will print the paths as paths rather than as a tree. |
| `viq <id(s)>` | view_request_info, viq | View additional information about requests. Includes the target port, if SSL was used, applied tags, and other information. |
| `vfq <id(s)>` | view_full_request, vfq, kjq | [V]iew [F]ull Re[Q]uest, prints the full request including headers and data. |
| `vbq <id(s)>` | view_request_bytes, vbq | [V]iew [B]ytes of Re[Q]uest, prints the full request including headers and data without coloring or additional newlines. Use this if you want to write a request to a file. |
| `ppq <format> <id(s)> ` | pretty_print_request, ppq | Pretty print a request with a specific format. See the table below for a list of formats. |
@@ -230,7 +317,7 @@ The following commands can be used to view requests and responses
| `vbs <id(s)>` | view_response_bytes, vbs | [V]iew [B]ytes of Re[S]ponse, prints the full response including headers and data without coloring or additional newlines. Use this if you want to write a response to a file. |
| `pps <format> <id(s)>` | pretty_print_response, pps | Pretty print a response with a specific format. See the table below for a list of formats. |
| `pprm <id(s)>` | print_params, pprm | Print a summary of the parameters submitted with the request. It will include URL params, POST params, and/or cookies |
| `pri [ct] [key(s)] | param_info, pri | Print a summary of the parameters and values submitted by in-context requests. You can pass in keys to limit which values will be shown. If you also provide `ct` as the first argument, it will include any keys that are passed as arguments. |
| `pri [ct] [key(s)]` | param_info, pri | Print a summary of the parameters and values submitted by in-context requests. You can pass in keys to limit which values will be shown. If you also provide `ct` as the first argument, it will include any keys that are passed as arguments. |
| `watch` | watch | Print requests and responses in real time as they pass through the proxy. |
Available formats for `ppq` and `pps` commands:
@@ -528,7 +615,7 @@ When you're done with repeater, run ":qa!" to avoid having to save changes to no
| Vim Command | Keybinding | Action |
|:--------|:-----------|:-------|
| `RepeaterSubmitBuffer` | <leader>f | Submit the current buffer, split the windows vertically, and show the result in the right window |
| `RepeaterSubmitBuffer` | `<leader>f` | Submit the current buffer, split the windows vertically, and show the result in the right window |
Macros
------
@@ -544,42 +631,13 @@ $ ls -l
-rw-r--r-- 1 scaryhacker wheel 241 Nov 26 17:18 macro_test.py
```
In this case we have a `blank`, `hackthensa`, `testgen`, and `test` macro. A macro script is any python script that defines a `run_macro(args)` function and a `MACRO_NAME` variable. For example, a simple macro would be:
In this case we have a `blank`, `hackthensa`, `testgen`, and `test` macro. A macro script is any python script that defines a `run_macro(args)` function and a `MACRO_NAME` variable. To start with, we'll write a macro to iterate over a numbered image to try and find other images. We will take the following steps to do it:
```
### macro_print.py
MACRO_NAME = 'Print Macro'
def run_macro(args):
if args:
print "Hello, %s!" % args[0]
else:
print "Hello, Pappy!"
```
You can place this macro in your project directory then load and run it from Pappy. When a macro is run, arguments are passed from the command line. Arguments are separated the same way as they are on the command line, so if you want to use spaces in your argument, you have to put quotes around it.
```
$ pappy
Proxy is listening on port 8000
pappy> lma
Loaded "<Macro Test Macro (tm/test)>"
Loaded "<Macro Macro 6494496 (testgen)>"
Loaded "<Macro Print Macro (print)>"
Loaded "<Macro Hack the NSA (htnsa/hackthensa)>"
Loaded "<Macro Macro 62449408 (blank)>"
pappy> rma print
Hello, Pappy!
pappy> rma print NSA
Hello, NSA!
pappy> rma print Idiot Slayer
Hello, Idiot!
pappy> rma print "Idiot Slayer"
Hello, Idiot Slayer!
```
You'll need to run `lma` every time you make a change to the macro in order to reload it. In addition, any code outside of the `run_macro` function will be run when the macro gets loaded.
1. Make a request to the image
2. Generate a macro using the `gma` command
3. Write a loop to copy the original request, modify it, then submit it with different numbers
4. Load the macro in Pappy with the `lma` command
5. Run the macro with the `rma` command
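The core of step 3 is just a loop over candidate numbers. A plain-Python sketch of that loop is below; the path template and range are made up for illustration, and the real macro would copy the recorded request, set its path, and submit it through Pappy (those API calls are deliberately left as comments since they depend on your version):

```python
MACRO_NAME = 'Find Numbered Images'

def numbered_paths(template, first, last):
    # Build candidate paths by substituting each number into the
    # original image path, e.g. '/images/img_1.jpg', '/images/img_2.jpg', ...
    return [template % n for n in range(first, last + 1)]

def run_macro(args):
    for path in numbered_paths('/images/img_%d.jpg', 1, 3):
        # In the real macro: copy the original request, update its path
        # to `path`, submit it, and check the response status code.
        print(path)
```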
### Generating Macros From Requests
@@ -633,6 +691,18 @@ def run_macro(args):
If you enter a value for `SHORT_NAME`, you can use it as a shortcut to run that macro. For example, if a macro sets `SHORT_NAME='tm'`, you can run it with `pappy> rma tm`.
### Passing Arguments to Macros
When you run the macro, any additional command line arguments will be passed to the `run_macro` function in the `args` argument. For example, if you run your macro using
```
pappy> rma foo thisis an "amazing argument"
```
The `args` argument of `run_macro` will be `["thisis", "an", "amazing argument"]`. If no arguments are given, `args` will be an empty list.
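This splitting follows shell-style quoting rules. Pappy's exact parser isn't shown here, but Python's standard `shlex` module implements the same convention, so it's a handy way to preview how a command line will be broken up:

```python
import shlex

# Arguments split on whitespace; quotes group words into a single argument.
args = shlex.split('thisis an "amazing argument"')
print(args)  # ['thisis', 'an', 'amazing argument']
```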
### Macro Commands
| Command | Aliases | Description |
|:--------|:--------|:------------|
| `lma [dir]` | `load_macros`, `lma` | Load macros from a directory. If `dir` is not given, use the current directory (the project directory) |
@@ -848,6 +918,10 @@ This is a list of other random stuff you can do that isn't categorized under any
If you don't have any intercepting macros running, Pappy will forward data to the browser as it gets it. However, if you're trying to mangle messages/responses, Pappy will need to download the entire message first.
Viewing Responses In Browser
----------------------------
You can view a response in your browser by visiting `http://pappy/rsp/<rspid>` (NOT pappy.com) while connected to the proxy. For example, to view the response to request 123, visit `http://pappy/rsp/123`. Pappy will return a response with the same body as the original response and will not make a request to the server. The response will not have the same headers as the original (aside from the Content-Type header). In addition, Pappy doesn't modify any URLs in the page, which means your browser will still fetch external resources like images and JavaScript from external servers.
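For example, assuming the default listener on port 8000, you could also pull a stored response from the command line by sending the request through the proxy with `curl` (request ID 123 is only an example):

```
$ curl --proxy http://127.0.0.1:8000 http://pappy/rsp/123
```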
Plugins
-------
Note that this section is a very quick overview of plugins. For a full description of how to write them, please see [the official docs](https://roglew.github.io/pappy-proxy/pappyplugins.html).
@@ -944,9 +1018,25 @@ Settings included in `~/.pappy/global_config.json`:
|:--------|:------------|
| cache_size | The number of requests from history that will be included in memory at any given time. Set to -1 to keep everything in memory. See the request cache section for more info. |
Using a SOCKS Server
--------------------
Pappy allows you to use an upstream SOCKS server. You can do this by adding a `socks_proxy` value to config.json. You can use the following for anonymous access to the proxy:
Using an HTTP Proxy
-------------------
Pappy allows you to use an upstream HTTP proxy. You can do this by adding an `http_proxy` value to config.json. You can use the following for anonymous access to the proxy:
```
"http_proxy": {"host":"httpproxy.proxy.host", "port":5555}
```
To use credentials you add a `username` and `password` value to the dictionary:
```
"http_proxy": {"host":"httpproxy.proxy.host", "port":5555, "username": "mario", "password":"ilovemushrooms"}
```
At the moment, only basic auth is supported. Anything in-scope that passes through any of the active listeners will use the proxy. Out of scope requests will not be sent through the proxy.
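Since only basic auth is supported, the upstream proxy simply receives a `Proxy-Authorization: Basic` header containing the base64 encoding of `username:password`. A quick sketch of that encoding (this is standard HTTP basic auth, not Pappy-specific code):

```python
import base64

# Basic auth credential: base64("username:password")
username, password = 'mario', 'ilovemushrooms'
token = base64.b64encode(('%s:%s' % (username, password)).encode('ascii')).decode('ascii')
print('Proxy-Authorization: Basic ' + token)
```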
Using a SOCKS Proxy
-------------------
Pappy allows you to use an upstream SOCKS proxy. You can do this by adding a `socks_proxy` value to config.json. You can use the following for anonymous access to the proxy:
```
"socks_proxy": {"host":"socks.proxy.host", "port":5555}
@@ -958,7 +1048,7 @@ To use credentials you add a `username` and `password` value to the dictionary:
"socks_proxy": {"host":"socks.proxy.host", "port":5555, "username": "mario", "password":"ilovemushrooms"}
```
Anything that passes through any of the active listeners will use the proxy.
Any in-scope requests that pass through any of the active listeners will use the proxy. Out of scope requests will not be sent through the proxy.
Transparent Host Redirection
----------------------------
@@ -1016,17 +1106,6 @@ Pappy will automatically use this host to make the connection and forward the re
FAQ
---
### I still like Burp, but Pappy looks interesting, can I use both?
Yes! If you don't want to go completely over to Pappy yet, you can configure Burp to use Pappy as an upstream proxy server. That way, traffic will go through both Burp and Pappy and you can use whichever you want to do your testing.
How to have Burp forward traffic through Pappy:
1. Open Burp
2. Go to `Options -> Connections -> Upstream Proxy Servers`
3. Click `Add`
4. Leave `Destination Host` blank, but put `127.0.0.1` in `Proxy Host` and `8000` into `Port` (assuming you're using the default listener)
5. Configure your browser to use Burp as a proxy
### Why does my request have an id of `--`?!?!
You can't do anything with a request/response until it is decoded and saved to disk. In between the time when a request is decoded and when it's saved to disk, it will have an ID of `--`. So just wait a little bit and it will get an ID you can use.
@@ -1044,6 +1123,12 @@ Changelog
---------
The boring part of the readme
* 0.2.8
* Upstream HTTP proxy support
* Usability improvements
* Docs docs docs
* Bugfixes, unit tests
* Add http://pappy functionality to view responses in the browser
* 0.2.7
* boring unit tests
* should make future releases more stable I guess

@@ -15,6 +15,7 @@
import sys
import os
import shlex
import pappyproxy
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
@@ -59,9 +60,9 @@ author = u'Rob Glew'
# built documents.
#
# The short X.Y version.
version = u'0.2.7'
version = pappyproxy.__version__
# The full version, including alpha/beta/rc tags.
release = u'0.2.7'
release = pappyproxy.__version__
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.

@@ -10,3 +10,9 @@ test-verbose:
test-macros:
py.test -v -rw --twisted tests/test_macros.py
test-proxy:
py.test -v -rw --twisted tests/test_proxy.py
test-comm:
py.test -v -rw --twisted tests/test_comm.py

@@ -0,0 +1 @@
__version__ = '0.2.8'

@@ -5,7 +5,6 @@ import json
from twisted.protocols.basic import LineReceiver
from twisted.internet import defer
from util import PappyException
from .http import Request, Response
"""
comm.py
@@ -13,13 +12,8 @@ Handles creating a listening server bound to localhost that other processes can
use to interact with the proxy.
"""
comm_port = 0
debug = True
def set_comm_port(port):
global comm_port
comm_port = port
class CommServer(LineReceiver):
MAX_LENGTH=sys.maxint
@@ -33,7 +27,6 @@ class CommServer(LineReceiver):
}
def lineReceived(self, line):
from .http import Request, Response
line = line.strip()
if line == '':
@@ -61,12 +54,10 @@ def action_error_handler(self, error, result):
def action_error_handler(self, error, result):
if debug:
print error.getTraceback()
return_data = {'success': False, 'message': 'Debug mode enabled, traceback on main terminal'}
else:
return_data = {'success': False, 'message': str(error.getErrorMessage())}
result.update(result)
self.sendLine(json.dumps(return_data))
error.trap(Exception)
return_data = {'success': False, 'message': str(error.getErrorMessage())}
result.update(result)
error.trap(Exception)
self.sendLine(json.dumps(return_data))
return True
def action_ping(self, data):
@@ -74,6 +65,7 @@ class CommServer(LineReceiver):
@defer.inlineCallbacks
def action_get_request(self, data):
from .http import Request
try:
reqid = data['reqid']
req = yield Request.load_request(reqid)
@@ -85,6 +77,7 @@ class CommServer(LineReceiver):
@defer.inlineCallbacks
def action_get_response(self, data):
from .http import Request, Response
try:
reqid = data['reqid']
req = yield Request.load_request(reqid)
@@ -100,8 +93,12 @@ class CommServer(LineReceiver):
@defer.inlineCallbacks
def action_submit_request(self, data):
from .http import Request
message = base64.b64decode(data['full_message'])
req = yield Request.submit_new(data['host'].encode('utf-8'), data['port'], data['is_ssl'], message)
try:
req = yield Request.submit_new(data['host'].encode('utf-8'), data['port'], data['is_ssl'], message)
except Exception:
raise PappyException('Error submitting request. Please make sure request is a valid HTTP message.')
if 'tags' in data:
req.tags = set(data['tags'])
yield req.async_deep_save()

@@ -1,227 +1,234 @@
"""
The configuration settings for the proxy.
.. data:: CERT_DIR
import json
import os
import shutil
class PappyConfig(object):
"""
The configuration settings for the proxy. To access the config object for the
current session (eg from plugins) use ``pappyproxy.pappy.session.config``.
.. data:: cert_dir
The location of the CA certs that Pappy will use. This can be configured in the
``config.json`` file for a project.
:Default: ``{DATADIR}/certs``
.. data:: PAPPY_DIR
.. data:: pappy_dir
The file where pappy's scripts are located. Don't write anything here, and you
probably don't need to write anything here. Use DATA_DIR instead.
:Default: Wherever the scripts are installed
.. data:: DATA_DIR
.. data:: data_dir
The data directory. This is where files that have to be read by Pappy every time
it's run are put. For example, plugins are stored in ``{DATADIR}/plugins`` and
certs are by default stored in ``{DATADIR}/certs``. This defaults to ``~/.pappy``
and isn't configurable right now.
:Default: ``~/.pappy``
.. data:: DATAFILE
.. data:: datafile
The location of the CA certs that Pappy will use. This can be configured in the
``config.json`` file for a project.
:Default: ``data.db``
.. data:: DEBUG_DIR
.. data:: debug_dir
The directory to write debug output to. Don't put this outside the project folder
since it writes all the request data to this directory. You probably won't need
to use this. Configured in the ``config.json`` file for the project.
:Default: None
.. data: LISTENERS
.. data: listeners
The list of active listeners. It is a list of tuples of the format (port, interface)
Not modifiable after startup. Configured in the ``config.json`` file for the project.
:Default: ``[(8000, '127.0.0.1')]``
.. data: SOCKS_PROXY
.. data: socks_proxy
Details for a SOCKS proxy. It is a dict with the following key/values::
host: The SOCKS proxy host
port: The proxy port
username: Username (optional)
password: Password (optional)
If null, no proxy will be used.
:Default: ``null``
.. data: http_proxy
Details for an upstream HTTP proxy. It is a dict with the following key/values::
host: The proxy host
port: The proxy port
username: Username (optional)
password: Password (optional)
If null, no proxy will be used.
.. data: PLUGIN_DIRS
.. data: plugin_dirs
List of directories that plugins are loaded from. Not modifiable.
:Default: ``['{DATA_DIR}/plugins', '{PAPPY_DIR}/plugins']``
.. data: SAVE_HISTORY
.. data: save_history
Whether command history should be saved to a file/loaded at startup.
:Default: True
.. data: CONFIG_DICT
.. data: config_dict
The dictionary read from config.json. When writing plugins, use this to load
configuration options for your plugin.
.. data: GLOBAL_CONFIG_DICT
.. data: global_config_dict
The dictionary from ~/.pappy/global_config.json. It contains settings for
Pappy that are specific to the current computer. Avoid putting settings here,
especially if it involves specific projects.
"""
"""
import json
import os
import shutil
PAPPY_DIR = os.path.dirname(os.path.realpath(__file__))
DATA_DIR = os.path.join(os.path.expanduser('~'), '.pappy')
CERT_DIR = os.path.join(DATA_DIR, 'certs')
DATAFILE = 'data.db'
DEBUG_DIR = None
DEBUG_TO_FILE = False
DEBUG_VERBOSITY = 0
LISTENERS = [(8000, '127.0.0.1')]
SOCKS_PROXY = None
SSL_CA_FILE = 'certificate.crt'
SSL_PKEY_FILE = 'private.key'
HISTSIZE = 1000
def __init__(self):
self.pappy_dir = os.path.dirname(os.path.realpath(__file__))
self.data_dir = os.path.join(os.path.expanduser('~'), '.pappy')
PLUGIN_DIRS = [os.path.join(DATA_DIR, 'plugins'), os.path.join(PAPPY_DIR, 'plugins')]
self.cert_dir = os.path.join(self.data_dir, 'certs')
CONFIG_DICT = {}
GLOBAL_CONFIG_DICT = {}
self.datafile = 'data.db'
def get_default_config():
default_config_file = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'default_user_config.json')
with open(default_config_file) as f:
settings = json.load(f)
return settings
self.debug_dir = None
self.debug_to_file = False
self.debug_verbosity = 0
def load_settings(proj_config):
global CERT_DIR
global DATAFILE
global DEBUG_DIR
global DEBUG_TO_FILE
global DEBUG_VERBOSITY
global LISTENERS
global SOCKS_PROXY
global PAPPY_DIR
global DATA_DIR
global SSL_CA_FILE
global SSL_PKEY_FILE
global HISTSIZE
self.listeners = [(8000, '127.0.0.1')]
self.socks_proxy = None
self.http_proxy = None
# Substitution dictionary
subs = {}
subs['PAPPYDIR'] = PAPPY_DIR
subs['DATADIR'] = DATA_DIR
self.ssl_ca_file = 'certificate.crt'
self.ssl_pkey_file = 'private.key'
# Data file settings
if 'data_file' in proj_config:
DATAFILE = proj_config["data_file"].format(**subs)
self.histsize = 1000
# Debug settings
if 'debug_dir' in proj_config:
if proj_config['debug_dir']:
DEBUG_TO_FILE = True
DEBUG_DIR = proj_config["debug_dir"].format(**subs)
self.plugin_dirs = [os.path.join(self.data_dir, 'plugins'), os.path.join(self.pappy_dir, 'plugins')]
# Cert directory settings
if 'cert_dir' in proj_config:
CERT_DIR = proj_config["cert_dir"].format(**subs)
# Listener settings
if "proxy_listeners" in proj_config:
LISTENERS = []
for l in proj_config["proxy_listeners"]:
ll = {}
if 'forward_host_ssl' in l:
l['forward_host_ssl'] = l['forward_host_ssl'].encode('utf-8')
if 'forward_host' in l:
l['forward_host'] = l['forward_host'].encode('utf-8')
LISTENERS.append(l)
# SOCKS proxy settings
if "socks_proxy" in proj_config:
SOCKS_PROXY = None
if proj_config['socks_proxy'] is not None:
conf = proj_config['socks_proxy']
if 'host' in conf and 'port' in conf:
SOCKS_PROXY = {}
SOCKS_PROXY['host'] = conf['host'].encode('utf-8')
SOCKS_PROXY['port'] = conf['port']
if 'username' in conf:
if 'password' in conf:
SOCKS_PROXY['username'] = conf['username'].encode('utf-8')
SOCKS_PROXY['password'] = conf['password'].encode('utf-8')
else:
print 'SOCKS proxy has a username but no password. Ignoring creds.'
else:
print 'Host is missing host/port.'
# History saving settings
if "history_size" in proj_config:
HISTSIZE = proj_config['history_size']
def load_global_settings(global_config):
from .http import Request
global CACHE_SIZE
if "cache_size" in global_config:
CACHE_SIZE = global_config['cache_size']
else:
CACHE_SIZE = 2000
Request.cache.resize(CACHE_SIZE)
def load_from_file(fname):
global CONFIG_DICT
# Make sure we have a config file
if not os.path.isfile(fname):
print "Copying default config to %s" % fname
self.config_dict = {}
self.global_config_dict = {}
def get_default_config(self):
default_config_file = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'default_user_config.json')
shutil.copyfile(default_config_file, fname)
# Load local project config
with open(fname, 'r') as f:
CONFIG_DICT = json.load(f)
load_settings(CONFIG_DICT)
def global_load_from_file():
global GLOBAL_CONFIG_DICT
global DATA_DIR
# Make sure we have a config file
fname = os.path.join(DATA_DIR, 'global_config.json')
if not os.path.isfile(fname):
print "Copying default global config to %s" % fname
default_global_config_file = os.path.join(PAPPY_DIR,
'default_global_config.json')
shutil.copyfile(default_global_config_file, fname)
# Load local project config
with open(fname, 'r') as f:
GLOBAL_CONFIG_DICT = json.load(f)
load_global_settings(GLOBAL_CONFIG_DICT)
'default_user_config.json')
with open(default_config_file) as f:
settings = json.load(f)
return settings
@staticmethod
def _parse_proxy_login(conf):
proxy = {}
if 'host' in conf and 'port' in conf:
proxy = {}
proxy['host'] = conf['host'].encode('utf-8')
proxy['port'] = conf['port']
if 'username' in conf:
if 'password' in conf:
proxy['username'] = conf['username'].encode('utf-8')
proxy['password'] = conf['password'].encode('utf-8')
else:
print 'Proxy has a username but no password. Ignoring creds.'
else:
print 'Host is missing host/port.'
return None
return proxy
def load_settings(self, proj_config):
# Substitution dictionary
subs = {}
subs['PAPPYDIR'] = self.pappy_dir
subs['DATADIR'] = self.data_dir
# Data file settings
if 'data_file' in proj_config:
self.datafile = proj_config["data_file"].format(**subs)
# Debug settings
if 'debug_dir' in proj_config:
if proj_config['debug_dir']:
self.debug_to_file = True
self.debug_dir = proj_config["debug_dir"].format(**subs)
# Cert directory settings
if 'cert_dir' in proj_config:
self.cert_dir = proj_config["cert_dir"].format(**subs)
# Listener settings
if "proxy_listeners" in proj_config:
self.listeners = []
for l in proj_config["proxy_listeners"]:
if 'forward_host_ssl' in l:
l['forward_host_ssl'] = l['forward_host_ssl'].encode('utf-8')
if 'forward_host' in l:
l['forward_host'] = l['forward_host'].encode('utf-8')
self.listeners.append(l)
# SOCKS proxy settings
self.socks_proxy = None
if "socks_proxy" in proj_config:
if proj_config['socks_proxy'] is not None:
self.socks_proxy = PappyConfig._parse_proxy_login(proj_config['socks_proxy'])
# HTTP proxy settings
self.http_proxy = None
if "http_proxy" in proj_config:
if proj_config['http_proxy'] is not None:
self.http_proxy = PappyConfig._parse_proxy_login(proj_config['http_proxy'])
# History saving settings
if "history_size" in proj_config:
self.histsize = proj_config['history_size']
def load_global_settings(self, global_config):
from .http import Request
if "cache_size" in global_config:
self.cache_size = global_config['cache_size']
else:
self.cache_size = 2000
Request.cache.resize(self.cache_size)
def load_from_file(self, fname):
# Make sure we have a config file
if not os.path.isfile(fname):
print "Copying default config to %s" % fname
default_config_file = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'default_user_config.json')
shutil.copyfile(default_config_file, fname)
# Load local project config
with open(fname, 'r') as f:
self.config_dict = json.load(f)
self.load_settings(self.config_dict)
def global_load_from_file(self):
# Make sure we have a config file
fname = os.path.join(self.data_dir, 'global_config.json')
if not os.path.isfile(fname):
print "Copying default global config to %s" % fname
default_global_config_file = os.path.join(self.pappy_dir,
'default_global_config.json')
shutil.copyfile(default_global_config_file, fname)
# Load local project config
with open(fname, 'r') as f:
self.global_config_dict = json.load(f)
self.load_global_settings(self.global_config_dict)

@@ -3,20 +3,14 @@ Contains helpers for interacting with the console. Includes definition for the
class that is used to run the console.
"""
import StringIO
import atexit
import cmd2
import os
import re
import readline
import string
import sys
import itertools
from .util import PappyException
from .colors import Styles, Colors, verb_color, scode_color, path_formatter, host_color
from . import config
from twisted.internet import defer
from .colors import Colors
###################
## Helper functions
@@ -29,229 +23,6 @@ def print_pappy_errors(func):
print str(e)
return catch
@defer.inlineCallbacks
def load_reqlist(line, allow_special=True, ids_only=False):
"""
load_reqlist(line, allow_special=True)
A helper function for parsing a list of requests that are passed as an
argument. If ``allow_special`` is True, then it will parse IDs such as
``u123`` or ``s123``. Even if allow_special is false, it will still parse
``m##`` IDs. Will print any errors with loading any of the requests and
will return a list of all the requests which were successfully loaded.
Returns a deferred.
:Returns: Twisted deferred
"""
from .http import Request
# Parses a comma separated list of ids and returns a list of those requests
# prints any errors
ids = re.split(',\s*', line)
reqs = []
if not ids_only:
for reqid in ids:
try:
req = yield Request.load_request(reqid, allow_special)
reqs.append(req)
except PappyException as e:
print e
defer.returnValue(reqs)
else:
defer.returnValue(ids)
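The id string is split on commas followed by optional whitespace, so users can write `1,2` or `1, 2` interchangeably:

```python
import re

def split_ids(line):
    # Commas may be followed by arbitrary whitespace
    return re.split(r',\s*', line)
```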
def print_table(coldata, rows):
"""
Print a table.
Coldata: List of dicts with info on how to print the columns.
``name`` is the heading to give column,
``width (optional)`` maximum width before truncating. 0 for unlimited.
Rows: List of tuples with the data to print
"""
# Get the width of each column
widths = []
headers = []
for data in coldata:
if 'name' in data:
headers.append(data['name'])
else:
headers.append('')
empty_headers = True
for h in headers:
if h != '':
empty_headers = False
if not empty_headers:
rows = [headers] + rows
for i in range(len(coldata)):
col = coldata[i]
if 'width' in col and col['width'] > 0:
maxwidth = col['width']
else:
maxwidth = 0
colwidth = 0
for row in rows:
printdata = row[i]
if isinstance(printdata, dict):
collen = len(str(printdata['data']))
else:
collen = len(str(printdata))
if collen > colwidth:
colwidth = collen
if maxwidth > 0 and colwidth > maxwidth:
widths.append(maxwidth)
else:
widths.append(colwidth)
# Print rows
padding = 2
is_heading = not empty_headers
for row in rows:
if is_heading:
sys.stdout.write(Styles.TABLE_HEADER)
for (col, width) in zip(row, widths):
if isinstance(col, dict):
printstr = str(col['data'])
if 'color' in col:
colors = col['color']
formatter = None
elif 'formatter' in col:
colors = None
formatter = col['formatter']
else:
colors = None
formatter = None
else:
printstr = str(col)
colors = None
formatter = None
if len(printstr) > width:
trunc_printstr=printstr[:width]
trunc_printstr=trunc_printstr[:-3]+'...'
else:
trunc_printstr=printstr
if colors is not None:
sys.stdout.write(colors)
sys.stdout.write(trunc_printstr)
sys.stdout.write(Colors.ENDC)
elif formatter is not None:
toprint = formatter(printstr, width)
sys.stdout.write(toprint)
else:
sys.stdout.write(trunc_printstr)
sys.stdout.write(' '*(width-len(printstr)))
sys.stdout.write(' '*padding)
if is_heading:
sys.stdout.write(Colors.ENDC)
is_heading = False
sys.stdout.write('\n')
sys.stdout.flush()
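The column-width pass above can be condensed into a small helper; this is a minimal sketch of the same idea (not the pappy implementation), computing per-column widths with an optional cap of 0 meaning unlimited:

```python
def column_widths(rows, caps):
    # caps[i] == 0 means unlimited; otherwise truncate the column at caps[i]
    widths = []
    for i, cap in enumerate(caps):
        longest = max(len(str(row[i])) for row in rows)
        widths.append(min(longest, cap) if cap > 0 else longest)
    return widths
```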
def print_requests(requests):
"""
Takes in a list of requests and prints a table with data on each of the
requests. It's the same table that's used by ``ls``.
"""
rows = []
for req in requests:
rows.append(get_req_data_row(req))
print_request_rows(rows)
def print_request_rows(request_rows):
"""
Takes in a list of request rows generated from :func:`pappyproxy.console.get_req_data_row`
and prints a table with data on each of the
requests. Used instead of :func:`pappyproxy.console.print_requests` if you
can't count on storing all the requests in memory at once.
"""
# Print a table with info on all the requests in the list
cols = [
{'name':'ID'},
{'name':'Verb'},
{'name': 'Host'},
{'name':'Path', 'width':40},
{'name':'S-Code', 'width':16},
{'name':'Req Len'},
{'name':'Rsp Len'},
{'name':'Time'},
{'name':'Mngl'},
]
print_rows = []
for row in request_rows:
(reqid, verb, host, path, scode, qlen, slen, time, mngl) = row
verb = {'data':verb, 'color':verb_color(verb)}
scode = {'data':scode, 'color':scode_color(scode)}
host = {'data':host, 'color':host_color(host)}
path = {'data':path, 'formatter':path_formatter}
print_rows.append((reqid, verb, host, path, scode, qlen, slen, time, mngl))
print_table(cols, print_rows)
def get_req_data_row(request):
"""
Get the row data for a request to be printed.
"""
rid = request.reqid
method = request.verb
if 'host' in request.headers:
host = request.headers['host']
else:
host = '??'
path = request.full_path
reqlen = len(request.body)
rsplen = 'N/A'
mangle_str = '--'
if request.unmangled:
mangle_str = 'q'
if request.response:
response_code = str(request.response.response_code) + \
' ' + request.response.response_text
rsplen = len(request.response.body)
if request.response.unmangled:
if mangle_str == '--':
mangle_str = 's'
else:
mangle_str += '/s'
else:
response_code = ''
time_str = '--'
if request.time_start and request.time_end:
time_delt = request.time_end - request.time_start
time_str = "%.2f" % time_delt.total_seconds()
return [rid, method, host, path, response_code,
reqlen, rsplen, time_str, mangle_str]
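The time column is the request's wall-clock duration rendered to two decimals, exactly the `total_seconds` formatting used above:

```python
import datetime

start = datetime.datetime(2016, 1, 1, 12, 0, 0)
end = start + datetime.timedelta(seconds=1.5)
# Same formatting as the table's Time column
time_str = '%.2f' % (end - start).total_seconds()
```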
def confirm(message, default='n'):
"""
A helper function to get confirmation from the user. It prints ``message``
then asks the user to answer yes or no. Returns True if the user answers
yes, otherwise returns False.
"""
if 'n' in default.lower():
default = False
else:
default = True
print message
if default:
answer = raw_input('(Y/n) ')
else:
answer = raw_input('(y/N) ')
if not answer:
return default
if answer[0].lower() == 'y':
return True
else:
return False
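The default-answer handling boils down to: empty input returns the default, and anything starting with 'y' counts as yes. A sketch with the prompting factored out so the decision logic stands alone (function name is illustrative):

```python
def interpret_answer(answer, default='n'):
    # An empty answer falls back to the default; otherwise anything
    # starting with 'y' (case-insensitive) counts as yes
    default_val = 'n' not in default.lower()
    if not answer:
        return default_val
    return answer[0].lower() == 'y'
```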
##########
## Classes
@@ -265,14 +36,16 @@ class ProxyCmd(cmd2.Cmd):
# the \x01/\x02 are to make the prompt behave properly with the readline library
self.prompt = 'pappy\x01' + Colors.YELLOW + '\x02> \x01' + Colors.ENDC + '\x02'
self.debug = True
self.session = kwargs['session']
del kwargs['session']
self._cmds = {}
self._aliases = {}
atexit.register(self.save_histfile)
readline.set_history_length(config.HISTSIZE)
readline.set_history_length(self.session.config.histsize)
if os.path.exists('cmdhistory'):
if config.HISTSIZE != 0:
if self.session.config.histsize != 0:
readline.read_history_file('cmdhistory')
else:
os.remove('cmdhistory')
@@ -338,8 +111,8 @@ class ProxyCmd(cmd2.Cmd):
def save_histfile(self):
# Write the command to the history file
if config.HISTSIZE != 0:
readline.set_history_length(config.HISTSIZE)
if self.session.config.histsize != 0:
readline.set_history_length(self.session.config.histsize)
readline.write_history_file('cmdhistory')
def get_names(self):
@@ -379,14 +152,3 @@ class ProxyCmd(cmd2.Cmd):
for command, alias in alias_list:
self.add_alias(command, alias)
# Taken from http://stackoverflow.com/questions/16571150/how-to-capture-stdout-output-from-a-python-function-call
# then modified
class Capturing():
def __enter__(self):
self._stdout = sys.stdout
sys.stdout = self._stringio = StringIO.StringIO()
return self
def __exit__(self, *args):
self.val = self._stringio.getvalue()
sys.stdout = self._stdout

@@ -1,10 +1,8 @@
import crochet
import pappyproxy
import re
import shlex
from .http import Request, RepeatableDict
from .requestcache import RequestCache
from twisted.internet import defer
from util import PappyException
@@ -100,99 +98,6 @@ class Context(object):
class FilterParseError(PappyException):
pass
class Filter(object):
"""
A class representing a filter. Its claim to fame is that you can use
:func:`pappyproxy.context.Filter.from_filter_string` to generate a
filter from a filter string.
"""
def __init__(self, filter_string):
self.filter_string = filter_string
def __call__(self, *args, **kwargs):
return self.filter_func(*args, **kwargs)
def __repr__(self):
return '<Filter "%s">' % self.filter_string
@defer.inlineCallbacks
def generate(self):
self.filter_func = yield self.from_filter_string(self.filter_string)
@staticmethod
@defer.inlineCallbacks
def from_filter_string(filter_string=None, parsed_args=None):
"""
from_filter_string(filter_string)
Create a filter from a filter string. If passed a list of arguments, they
will be used instead of parsing the string.
:rtype: Deferred that returns a :class:`pappyproxy.context.Filter`
"""
if parsed_args is not None:
args = parsed_args
else:
args = shlex.split(filter_string)
if len(args) == 0:
raise PappyException('Field is required')
field = args[0]
new_filter = None
field_args = args[1:]
if field in ("all",):
new_filter = gen_filter_by_all(field_args)
elif field in ("host", "domain", "hs", "dm"):
new_filter = gen_filter_by_host(field_args)
elif field in ("path", "pt"):
new_filter = gen_filter_by_path(field_args)
elif field in ("body", "bd", "data", "dt"):
new_filter = gen_filter_by_body(field_args)
elif field in ("reqbody", "qbd", "reqdata", "qdt"):
new_filter = gen_filter_by_req_body(field_args)
elif field in ("rspbody", "sbd", "qspdata", "sdt"):
new_filter = gen_filter_by_rsp_body(field_args)
elif field in ("verb", "vb"):
new_filter = gen_filter_by_verb(field_args)
elif field in ("param", "pm"):
new_filter = gen_filter_by_params(field_args)
elif field in ("header", "hd"):
new_filter = gen_filter_by_headers(field_args)
elif field in ("reqheader", "qhd"):
new_filter = gen_filter_by_request_headers(field_args)
elif field in ("rspheader", "shd"):
new_filter = gen_filter_by_response_headers(field_args)
elif field in ("rawheaders", "rh"):
new_filter = gen_filter_by_raw_headers(field_args)
elif field in ("sentcookie", "sck"):
new_filter = gen_filter_by_submitted_cookies(field_args)
elif field in ("setcookie", "stck"):
new_filter = gen_filter_by_set_cookies(field_args)
elif field in ("statuscode", "sc", "responsecode"):
new_filter = gen_filter_by_response_code(field_args)
elif field in ("responsetime", "rt"):
raise PappyException('Not implemented yet, sorry!')
elif field in ("tag", "tg"):
new_filter = gen_filter_by_tag(field_args)
elif field in ("saved", "svd"):
new_filter = gen_filter_by_saved(field_args)
elif field in ("before", "b4", "bf"):
new_filter = yield gen_filter_by_before(field_args)
elif field in ("after", "af"):
new_filter = yield gen_filter_by_after(field_args)
elif field in ("inv",):
new_filter = yield gen_filter_by_inverse(field_args)
else:
raise FilterParseError("%s is not a valid field" % field)
if new_filter is None:
raise FilterParseError("Error creating filter")
# dirty hack to get it to work if we don't generate any deferreds
# d = defer.Deferred()
# d.callback(None)
# yield d
defer.returnValue(new_filter)
def cmp_is(a, b):
if a is None or b is None:
@@ -688,3 +593,127 @@ def reset_context_caches():
import pappyproxy.pappy
for c in pappyproxy.pappy.all_contexts:
c.cache_reset()
class Filter(object):
"""
A class representing a filter. Its claim to fame is that you can use
:func:`pappyproxy.context.Filter.from_filter_string` to generate a
filter from a filter string.
"""
_filter_functions = {
"all": gen_filter_by_all,
"host": gen_filter_by_host,
"domain": gen_filter_by_host,
"hs": gen_filter_by_host,
"dm": gen_filter_by_host,
"path": gen_filter_by_path,
"pt": gen_filter_by_path,
"body": gen_filter_by_body,
"bd": gen_filter_by_body,
"data": gen_filter_by_body,
"dt": gen_filter_by_body,
"reqbody": gen_filter_by_req_body,
"qbd": gen_filter_by_req_body,
"reqdata": gen_filter_by_req_body,
"qdt": gen_filter_by_req_body,
"rspbody": gen_filter_by_rsp_body,
"sbd": gen_filter_by_rsp_body,
"qspdata": gen_filter_by_rsp_body,
"sdt": gen_filter_by_rsp_body,
"verb": gen_filter_by_verb,
"vb": gen_filter_by_verb,
"param": gen_filter_by_params,
"pm": gen_filter_by_params,
"header": gen_filter_by_headers,
"hd": gen_filter_by_headers,
"reqheader": gen_filter_by_request_headers,
"qhd": gen_filter_by_request_headers,
"rspheader": gen_filter_by_response_headers,
"shd": gen_filter_by_response_headers,
"rawheaders": gen_filter_by_raw_headers,
"rh": gen_filter_by_raw_headers,
"sentcookie": gen_filter_by_submitted_cookies,
"sck": gen_filter_by_submitted_cookies,
"setcookie": gen_filter_by_set_cookies,
"stck": gen_filter_by_set_cookies,
"statuscode": gen_filter_by_response_code,
"sc": gen_filter_by_response_code,
"responsecode": gen_filter_by_response_code,
"tag": gen_filter_by_tag,
"tg": gen_filter_by_tag,
"saved": gen_filter_by_saved,
"svd": gen_filter_by_saved,
}
_async_filter_functions = {
"before": gen_filter_by_before,
"b4": gen_filter_by_before,
"bf": gen_filter_by_before,
"after": gen_filter_by_after,
"af": gen_filter_by_after,
"inv": gen_filter_by_inverse,
}
def __init__(self, filter_string):
self.filter_string = filter_string
def __call__(self, *args, **kwargs):
return self.filter_func(*args, **kwargs)
def __repr__(self):
return '<Filter "%s">' % self.filter_string
@defer.inlineCallbacks
def generate(self):
self.filter_func = yield self.from_filter_string(self.filter_string)
@staticmethod
@defer.inlineCallbacks
def from_filter_string(filter_string=None, parsed_args=None):
"""
from_filter_string(filter_string)
Create a filter from a filter string. If passed a list of arguments, they
will be used instead of parsing the string.
:rtype: Deferred that returns a :class:`pappyproxy.context.Filter`
"""
if parsed_args is not None:
args = parsed_args
else:
args = shlex.split(filter_string)
if len(args) == 0:
raise PappyException('Field is required')
field = args[0]
new_filter = None
field_args = args[1:]
if field in Filter._filter_functions:
new_filter = Filter._filter_functions[field](field_args)
elif field in Filter._async_filter_functions:
new_filter = yield Filter._async_filter_functions[field](field_args)
else:
raise FilterParseError("%s is not a valid field" % field)
if new_filter is None:
raise FilterParseError("Error creating filter")
defer.returnValue(new_filter)
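The refactor replaces the old if/elif chain with dict lookups keyed by field name and alias. The same dispatch-table pattern in miniature (the filter generators here are stand-ins, not the real `gen_filter_by_*` functions):

```python
def by_host(args):
    # Stand-in for gen_filter_by_host: substring match on the host
    return lambda req: args[0] in req['host']

# Each field name and each of its aliases maps to the same generator
FILTER_FUNCTIONS = {'host': by_host, 'hs': by_host, 'dm': by_host}

def make_filter(field, field_args):
    if field not in FILTER_FUNCTIONS:
        raise ValueError('%s is not a valid field' % field)
    return FILTER_FUNCTIONS[field](field_args)
```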

@@ -17,14 +17,16 @@ from .requestcache import RequestCache
from .colors import Colors, host_color, path_formatter
from pygments.formatters import TerminalFormatter
from pygments.lexers import get_lexer_for_mimetype, HttpLexer
from twisted.internet import defer, reactor
import sys
from twisted.internet import defer
ENCODE_NONE = 0
ENCODE_DEFLATE = 1
ENCODE_GZIP = 2
PATH_RELATIVE = 0
PATH_ABSOLUTE = 1
PATH_HOST = 2
dbpool = None
def init(pool):
@@ -535,7 +537,11 @@ class HTTPMessage(object):
:ivar start_line: The start line of the message
:vartype start_line: string
"""
reserved_meta_keys = ['full_message']
"""
Internal class variable. Do not modify.
"""
def __init__(self, full_message=None, update_content_length=False):
# Initializes instance variables too
@@ -577,6 +583,8 @@ class HTTPMessage(object):
def deepcopy(self):
"""
Returns a deep copy of the message. Implemented by child.
NOINDEX
"""
return self.__deepcopy__()
@@ -795,6 +803,8 @@ class HTTPMessage(object):
:type line: string
:param key: Header value
:type line: string
NOINDEX
"""
if val is None:
return True
@@ -834,23 +844,29 @@ class HTTPMessage(object):
def handle_start_line(self, start_line):
"""
A handler function for the start line of the message.
NOINDEX
"""
self.start_line = start_line
def headers_end(self):
"""
Called when the headers are complete.
NOINDEX
"""
pass
def body_complete(self):
"""
Called when the body of the message is complete
NOINDEX
"""
try:
self.body = _decode_encoded(self._data_obj.body,
self._encoding_type)
except IOError as e:
except IOError:
# Screw handling it gracefully, this is the server's fault.
print 'Error decoding request, storing raw data in body instead'
self.body = self._data_obj.body
@@ -859,6 +875,8 @@ class HTTPMessage(object):
"""
Called when the body of the message is modified directly. Should be used
to update metadata that depends on the body of the message.
NOINDEX
"""
if len(self.body) > 0 or 'Content-Length' in self.headers:
self.headers.update('Content-Length', str(len(self.body)), do_callback=False)
@@ -867,6 +885,8 @@ class HTTPMessage(object):
"""
Called when a header is modified. Should be used to update metadata that
depends on the values of headers.
NOINDEX
"""
pass
@@ -882,6 +902,8 @@ class HTTPMessage(object):
Get all the metadata of the message in dictionary form.
Should be implemented in child class.
Should not be invoked outside of implementation!
NOINDEX
"""
pass
@@ -893,6 +915,8 @@ class HTTPMessage(object):
:param data: Metadata to apply
:type line: dict
NOINDEX
"""
pass
@@ -900,6 +924,8 @@ class HTTPMessage(object):
"""
Reset meta values to default values. Overridden by child class.
Should not be invoked outside of implementation!
NOINDEX
"""
pass
@@ -978,6 +1004,9 @@ class Request(HTTPMessage):
:vartype tags: List of Strings
:ivar plugin_data: Data about the request created by plugins. If you modify this, please add your own key to it for your plugin and store all your plugin's data under that key (probably as another dict). For example if you have a plugin called ``foo``, try and store all your data under ``req.plugin_data['foo']``.
:vartype plugin_data: Dict
:ivar path_type: An enum which describes how the path portion of the request should be represented. ``PATH_RELATIVE`` -> normal relative path, ``PATH_ABSOLUTE`` -> The absolute path (including the protocol), ``PATH_HOST`` -> Just the path and the port (Used for CONNECT requests when connecting to an upstream HTTP proxy).
:vartype path_type: Enum
:ivar explicit_port: A flag to indicate that the port should always be included in the URL
"""
cache = RequestCache(100)
@@ -986,7 +1015,8 @@
"""
def __init__(self, full_request=None, update_content_length=True,
port=None, is_ssl=None, host=None):
port=None, is_ssl=None, host=None, path_type=None,
proxy_creds=None, explicit_port=False):
# Resets instance variables
self.clear()
@@ -1007,6 +1037,10 @@
self.port = port
if host:
self._host = host
if path_type:
self.path_type = path_type
if explicit_port:
self.explicit_port = explicit_port
def __copy__(self):
if not self.complete:
@@ -1046,7 +1080,13 @@
"""
if not self.verb and not self.full_path and not self.version:
return ''
return '%s %s %s' % (self.verb, self.full_path, self.version)
if self.path_type == PATH_ABSOLUTE:
path = self._url_helper(always_have_path=True)
elif self.path_type == PATH_HOST:
path = ':'.join((self.host, str(self.port)))
else:
path = self.full_path
return '%s %s %s' % (self.verb, path, self.version)
@start_line.setter
def start_line(self, val):
@@ -1126,8 +1166,65 @@ class Request(HTTPMessage):
@raw_data.setter
def raw_data(self, val):
self.body = val
@property
def connect_request(self):
"""
If the request uses SSL, this will be a CONNECT request that can be sent to
an upstream HTTP proxy to open a tunnel to the server. Returns None otherwise.
"""
if not self.is_ssl:
return None
ret = Request()
ret.status_line = self.status_line
ret.host = self.host
ret.port = self.port
ret.explicit_port = True
ret.path_type = PATH_HOST
authu, authp = self.proxy_creds
ret.verb = 'CONNECT'
if authu and authp:
ret.proxy_creds = self.proxy_creds
return ret
@property
def proxy_creds(self):
"""
A username/password tuple representing the username/password to
authenticate to a proxy server. Sets the ``Proxy-Authorization``
header. Getter will return (None, None) if no creds exist
:getter: Returns the username/password tuple used for proxy authorization
:setter: Sets the username/password tuple used for proxy authorization
:type: Tuple of two strings: (username, password)
"""
if not 'Proxy-Authorization' in self.headers:
return (None, None)
return Request._parse_basic_auth(self.headers['Proxy-Authorization'])
@proxy_creds.setter
def proxy_creds(self, creds):
username, password = creds
self.headers['Proxy-Authorization'] = Request._encode_basic_auth(username, password)
@staticmethod
def _parse_basic_auth(header):
"""
Parse a raw basic auth header and return (username, password)
"""
_, creds = header.split(' ', 1)
decoded = base64.b64decode(creds)
username, password = decoded.split(':', 1)
return (username, password)
@staticmethod
def _encode_basic_auth(username, password):
decoded = '%s:%s' % (username, password)
encoded = base64.b64encode(decoded)
header = 'Basic %s' % encoded
return header
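The two helpers are inverses of each other; here is the round trip as a self-contained sketch with explicit bytes handling added (the originals assume Python 2 byte strings):

```python
import base64

def encode_basic_auth(username, password):
    # 'user:pass' -> 'Basic dXNlcjpwYXNz'
    raw = ('%s:%s' % (username, password)).encode('utf-8')
    return 'Basic %s' % base64.b64encode(raw).decode('ascii')

def parse_basic_auth(header):
    # Split off the 'Basic' scheme, then decode and split on the first colon
    # so passwords containing ':' survive the round trip
    _, creds = header.split(' ', 1)
    decoded = base64.b64decode(creds).decode('utf-8')
    username, password = decoded.split(':', 1)
    return (username, password)
```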
def _url_helper(self, colored=False):
def _url_helper(self, colored=False, always_have_path=False):
retstr = ''
if self.is_ssl:
retstr += 'https://'
@@ -1146,7 +1243,8 @@ class Request(HTTPMessage):
else:
retstr += self.host
if not ((self.is_ssl and self.port == 443) or \
(not self.is_ssl and self.port == 80)):
(not self.is_ssl and self.port == 80) or \
self.explicit_port):
if colored:
retstr += ':'
retstr += Colors.MAGENTA
@@ -1154,7 +1252,7 @@ class Request(HTTPMessage):
retstr += Colors.ENDC
else:
retstr += ':%d' % self.port
if self.path and self.path != '/':
if (self.path and self.path != '/') or always_have_path:
if colored:
retstr += path_formatter(self.path)
else:
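Condensing the branch above: the `:port` suffix is emitted only when the port is not the scheme default and `explicit_port` is unset. A plain sketch of that decision, without the coloring logic (the function name is illustrative):

```python
def build_url(host, port, is_ssl, explicit_port=False):
    url = ('https://' if is_ssl else 'http://') + host
    # 443 is the default for https, 80 for http
    default_port = (is_ssl and port == 443) or (not is_ssl and port == 80)
    if not (default_port or explicit_port):
        url += ':%d' % port
    return url
```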
@@ -1343,6 +1441,8 @@ class Request(HTTPMessage):
self.plugin_data = {}
self.reset_metadata()
self.is_unmangled_version = False
self.path_type = PATH_RELATIVE
self.explicit_port = False
############################
## Internal update functions
@@ -1531,7 +1631,6 @@ class Request(HTTPMessage):
:rtype: twisted.internet.defer.Deferred
"""
from .context import Context
from .pappy import main_context
global dbpool
@@ -1740,7 +1839,7 @@ class Request(HTTPMessage):
@defer.inlineCallbacks
def delete(self, cust_dbpool=None, cust_cache=None):
from .context import Context, reset_context_caches
from .context import reset_context_caches
global dbpool
if cust_dbpool:
@@ -1814,12 +1913,11 @@ class Request(HTTPMessage):
from .http import Request
global dbpool
if cust_dbpool:
use_dbpool = cust_dbpool
use_cache = cust_cache
else:
use_dbpool = dbpool
use_cache = Request.cache
req = Request(row[0])
if row[1]:
@@ -1871,7 +1969,6 @@ class Request(HTTPMessage):
:rtype: twisted.internet.defer.Deferred
"""
from .requestcache import RequestCache
from .http import Request
global dbpool
@@ -1922,10 +2019,8 @@ class Request(HTTPMessage):
global dbpool
if cust_dbpool:
use_dbpool = cust_dbpool
use_cache = cust_cache
else:
use_dbpool = dbpool
use_cache = Request.cache
# tags
rows = yield use_dbpool.runQuery(
@@ -1959,9 +2054,8 @@ class Request(HTTPMessage):
:rtype: twisted.internet.defer.Deferred
"""
from .context import Context
global dbpool
if cust_dbpool:
use_dbpool = cust_dbpool
cache_to_use = cust_cache
@@ -2051,8 +2145,8 @@ class Request(HTTPMessage):
:type full_request: string
:rtype: Twisted deferred that calls back with a Request
"""
from .proxy import ProxyClientFactory, get_next_connection_id, ClientTLSContext, get_endpoint
from .config import SOCKS_PROXY
from .proxy import ProxyClientFactory, get_next_connection_id, get_endpoint
from .pappy import session
new_req = Request(full_request)
new_req.is_ssl = is_ssl
@@ -2064,7 +2158,7 @@ class Request(HTTPMessage):
factory.connection_id = get_next_connection_id()
yield factory.prepare_request()
endpoint = get_endpoint(host, port, is_ssl,
socks_config=SOCKS_PROXY)
socks_config=session.config.socks_proxy)
yield endpoint.connect(factory)
new_req = yield factory.data_defer
defer.returnValue(new_req)
@@ -2161,7 +2255,10 @@ class Response(HTTPMessage):
"""
if not self.version and self.response_code == 0 and not self.response_text:
return ''
return '%s %d %s' % (self.version, self.response_code, self.response_text)
if self.response_text == '':
return '%s %d' % (self.version, self.response_code)
else:
return '%s %d %s' % (self.version, self.response_code, self.response_text)
@start_line.setter
def start_line(self, val):
@@ -2301,8 +2398,12 @@ class Response(HTTPMessage):
self.response_text = ''
return
self._first_line = False
self.version, self.response_code, self.response_text = \
start_line.split(' ', 2)
if len(start_line.split(' ')) > 2:
self.version, self.response_code, self.response_text = \
start_line.split(' ', 2)
else:
self.version, self.response_code = start_line.split(' ', 1)
self.response_text = ''
self.response_code = int(self.response_code)
if self.response_code == 304 or self.response_code == 204 or \
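The guarded split added above tolerates status lines with no reason phrase (e.g. `HTTP/1.1 200`); the same logic in isolation:

```python
def parse_status_line(start_line):
    # Reason phrase is optional, so only split it off when present
    if len(start_line.split(' ')) > 2:
        version, code, text = start_line.split(' ', 2)
    else:
        version, code = start_line.split(' ', 1)
        text = ''
    return (version, int(code), text)
```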
@@ -2376,10 +2477,8 @@ class Response(HTTPMessage):
global dbpool
if cust_dbpool:
use_dbpool = cust_dbpool
use_cache = cust_cache
else:
use_dbpool = dbpool
use_cache = Request.cache
assert(use_dbpool)
try:
# Check for intyness
@@ -2435,7 +2534,7 @@ class Response(HTTPMessage):
@defer.inlineCallbacks
def delete(self):
if self.rspid is not None:
row = yield dbpool.runQuery(
yield dbpool.runQuery(
"""
DELETE FROM responses WHERE id=?;
""",
@@ -2454,10 +2553,8 @@ class Response(HTTPMessage):
global dbpool
if cust_dbpool:
use_dbpool = cust_dbpool
use_cache = cust_cache
else:
use_dbpool = dbpool
use_cache = Request.cache
assert(use_dbpool)
rows = yield use_dbpool.runQuery(

@@ -1,11 +1,11 @@
import os
from .config import PAPPY_DIR
from .pappy import session
def from_file(fname, intro=False):
# Ignores lines until the first blank line, then returns every non-blank
# line afterwards
full_fname = os.path.join(PAPPY_DIR, 'lists', fname)
full_fname = os.path.join(session.config.pappy_dir, 'lists', fname)
with open(full_fname, 'r') as f:
d = f.read()
lines = d.splitlines()
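The comment's contract ("ignore lines until the first blank line, then return every non-blank line afterwards") can be expressed over a string directly; a sketch with hypothetical wordlist content:

```python
def wordlist_lines(text):
    lines = text.splitlines()
    # Skip past the header block, which ends at the first blank line
    if '' in lines:
        lines = lines[lines.index('') + 1:]
    # Keep only non-blank lines from the remainder
    return [l for l in lines if l]
```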

@@ -6,7 +6,7 @@ import re
import stat
from jinja2 import Environment, FileSystemLoader
from pappyproxy import config
from pappyproxy.pappy import session
from pappyproxy.util import PappyException
from twisted.internet import defer
@@ -279,7 +279,7 @@ def macro_from_requests(reqs, short_name='', long_name=''):
subs['req_lines'] = req_lines
subs['req_params'] = req_params
loader = FileSystemLoader(config.PAPPY_DIR+'/templates')
loader = FileSystemLoader(session.config.pappy_dir+'/templates')
env = Environment(loader=loader)
template = env.get_template('macro.py.template')
return template.render(zip=zip, **subs)
@@ -294,7 +294,7 @@ def gen_imacro(short_name='', long_name=''):
subs['short_name'] = short_name
loader = FileSystemLoader(config.PAPPY_DIR+'/templates')
loader = FileSystemLoader(session.config.pappy_dir+'/templates')
env = Environment(loader=loader)
template = env.get_template('intmacro.py.template')
return template.render(**subs)

@@ -1,23 +1,28 @@
#!/usr/bin/env python2
"""
Handles the main Pappy session.
.. data:: session
The :class:`pappyproxy.pappy.PappySession` object for the current session. Mainly
used for accessing the session's config information.
"""
import argparse
import crochet
import datetime
import os
import schema.update
import shutil
import signal
import sys
import tempfile
import signal
from . import comm
from . import config
from . import context
from . import http
from . import plugin
from . import proxy
from . import requestcache
from . import util
from .console import ProxyCmd
from twisted.enterprise import adbapi
from twisted.internet import reactor, defer
@@ -26,11 +31,11 @@ from twisted.internet.protocol import ServerFactory
from twisted.internet.threads import deferToThread
crochet.no_setup()
server_factories = []
main_context = context.Context()
all_contexts = [main_context]
plugin_loader = None
cons = None
session = None
quit_confirm_time = None
try:
from guppy import hpy
@@ -39,7 +44,110 @@ try:
except ImportError:
heapstats = None
class PappySession(object):
"""
An object representing a Pappy session. In most cases you will only use it to
access the session config.
:ivar config: The configuration settings for the session
:vartype config: :class:`pappyproxy.config.PappyConfig`
"""
def __init__(self, sessconfig):
self.config = sessconfig
self.complete_defer = defer.Deferred()
self.server_factories = []
self.plugin_loader = None
self.cons = None
self.dbpool = None
self.delete_data_on_quit = False
self.ports = None
@defer.inlineCallbacks
def start(self):
from . import proxy, plugin
# If the data file doesn't exist, create it with restricted permissions
if not os.path.isfile(self.config.datafile):
with os.fdopen(os.open(self.config.datafile, os.O_CREAT, 0o0600), 'r'):
pass
self.dbpool = adbapi.ConnectionPool("sqlite3", self.config.datafile,
check_same_thread=False,
cp_openfun=set_text_factory,
cp_max=1)
try:
yield schema.update.update_schema(self.dbpool, self.config.datafile)
except Exception as e:
print 'Error updating schema: %s' % e
print 'Exiting...'
self.complete_defer.callback(None)
return
http.init(self.dbpool)
yield http.Request.cache.load_ids()
context.reset_context_caches()
# Run the proxy
if self.config.debug_dir and os.path.exists(self.config.debug_dir):
shutil.rmtree(self.config.debug_dir)
print 'Removing old debugging output'
listen_strs = []
self.ports = []
for listener in self.config.listeners:
server_factory = proxy.ProxyServerFactory(save_all=True)
try:
if 'forward_host_ssl' in listener and listener['forward_host_ssl']:
server_factory.force_ssl = True
server_factory.forward_host = listener['forward_host_ssl']
elif 'forward_host' in listener and listener['forward_host']:
server_factory.force_ssl = False
server_factory.forward_host = listener['forward_host']
port = reactor.listenTCP(listener['port'], server_factory, interface=listener['interface'])
listener_str = 'port %d' % listener['port']
if listener['interface'] not in ('127.0.0.1', 'localhost'):
listener_str += ' (bound to %s)' % listener['interface']
listen_strs.append(listener_str)
self.ports.append(port)
self.server_factories.append(server_factory)
except CannotListenError as e:
print repr(e)
if listen_strs:
print 'Proxy is listening on %s' % (', '.join(listen_strs))
else:
print 'No listeners opened'
com_factory = ServerFactory()
com_factory.protocol = comm.CommServer
# Make the port different for every instance of pappy, then pass it to
# anything we run. Otherwise we can only have it running once on a machine
self.comm_port = reactor.listenTCP(0, com_factory, interface='127.0.0.1')
self.comm_port = self.comm_port.getHost().port
# Load the scope
yield context.load_scope(self.dbpool)
context.reset_to_scope(main_context)
sys.argv = [sys.argv[0]] # cmd2 tries to parse args
self.cons = ProxyCmd(session=session)
self.plugin_loader = plugin.PluginLoader(self.cons)
for d in self.config.plugin_dirs:
if not os.path.exists(d):
os.makedirs(d)
self.plugin_loader.load_directory(d)
# Add cleanup to defer
self.complete_defer = deferToThread(self.cons.cmdloop)
self.complete_defer.addCallback(self.cleanup)
@defer.inlineCallbacks
def cleanup(self, ignored=None):
for port in self.ports:
yield port.stopListening()
if self.delete_data_on_quit:
print 'Deleting temporary datafile'
os.remove(self.config.datafile)
def parse_args():
# parses sys.argv and returns a settings dictionary
@@ -59,122 +167,68 @@ def parse_args():
def set_text_factory(conn):
conn.text_factory = str
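`text_factory` controls how sqlite3 hands TEXT values back; the hook above pins it to byte strings for Python 2. The mechanism, demonstrated with an in-memory database (under Python 3 the default factory is `str`, so `bytes` is used here to show the effect of changing it):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (s TEXT)')
conn.execute("INSERT INTO t VALUES ('hello')")
# With the default text_factory, TEXT columns come back as str
text_val = conn.execute('SELECT s FROM t').fetchone()[0]
# After swapping the factory, the same column comes back as raw bytes
conn.text_factory = bytes
bytes_val = conn.execute('SELECT s FROM t').fetchone()[0]
```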
def delete_datafile():
print 'Deleting temporary datafile'
os.remove(config.DATAFILE)
def custom_int_handler(signum, frame):
# sorry
print "Sorry, we can't kill things partway through otherwise the data file might be left in a corrupt state"
@defer.inlineCallbacks
def main():
global server_factories
global plugin_loader
global cons
settings = parse_args()
global session
try:
settings = parse_args()
except SystemExit:
print 'Did you mean to just start the console? If so, just run `pappy` without any arguments, then enter commands into the prompt that appears.'
reactor.stop()
defer.returnValue(None)
pappy_config = config.PappyConfig()
if not os.path.exists(pappy_config.data_dir):
os.makedirs(pappy_config.data_dir)
if not os.path.exists(config.DATA_DIR):
os.makedirs(config.DATA_DIR)
session = PappySession(pappy_config)
signal.signal(signal.SIGINT, inturrupt_handler)
if settings['lite']:
conf_settings = config.get_default_config()
conf_settings = pappy_config.get_default_config()
conf_settings['debug_dir'] = None
conf_settings['debug_to_file'] = False
conf_settings['history_size'] = 0
with tempfile.NamedTemporaryFile(delete=False) as tf:
conf_settings['data_file'] = tf.name
print 'Temporary datafile is %s' % tf.name
delete_data_on_quit = True
config.load_settings(conf_settings)
session.delete_data_on_quit = True
pappy_config.load_settings(conf_settings)
else:
# Initialize config
config.load_from_file('./config.json')
config.global_load_from_file()
delete_data_on_quit = False
# If the data file doesn't exist, create it with restricted permissions
if not os.path.isfile(config.DATAFILE):
with os.fdopen(os.open(config.DATAFILE, os.O_CREAT, 0o0600), 'r') as f:
pass
dbpool = adbapi.ConnectionPool("sqlite3", config.DATAFILE,
check_same_thread=False,
cp_openfun=set_text_factory,
cp_max=1)
try:
yield schema.update.update_schema(dbpool, config.DATAFILE)
except Exception as e:
print 'Error updating schema: %s' % e
print 'Exiting...'
reactor.stop()
http.init(dbpool)
yield http.Request.cache.load_ids()
context.reset_context_caches()
# Run the proxy
if config.DEBUG_DIR and os.path.exists(config.DEBUG_DIR):
shutil.rmtree(config.DEBUG_DIR)
print 'Removing old debugging output'
listen_strs = []
ports = []
for listener in config.LISTENERS:
server_factory = proxy.ProxyServerFactory(save_all=True)
try:
if 'forward_host_ssl' in listener and listener['forward_host_ssl']:
server_factory.force_ssl = True
server_factory.forward_host = listener['forward_host_ssl']
elif 'forward_host' in listener and listener['forward_host']:
server_factory.force_ssl = False
server_factory.forward_host = listener['forward_host']
port = reactor.listenTCP(listener['port'], server_factory, interface=listener['interface'])
listener_str = 'port %d' % listener['port']
if listener['interface'] not in ('127.0.0.1', 'localhost'):
listener_str += ' (bound to %s)' % listener['interface']
listen_strs.append(listener_str)
ports.append(port)
server_factories.append(server_factory)
except CannotListenError as e:
print repr(e)
if listen_strs:
print 'Proxy is listening on %s' % (', '.join(listen_strs))
else:
print 'No listeners opened'
com_factory = ServerFactory()
com_factory.protocol = comm.CommServer
# Use a different port for every instance of pappy and pass it to
# anything we run; otherwise only one instance can run per machine
comm_port = reactor.listenTCP(0, com_factory, interface='127.0.0.1')
comm.set_comm_port(comm_port.getHost().port)
# Load the scope
yield context.load_scope(http.dbpool)
context.reset_to_scope(main_context)
sys.argv = [sys.argv[0]] # cmd2 tries to parse args
cons = ProxyCmd()
plugin_loader = plugin.PluginLoader(cons)
for d in config.PLUGIN_DIRS:
if not os.path.exists(d):
os.makedirs(d)
plugin_loader.load_directory(d)
pappy_config.load_from_file('./config.json')
pappy_config.global_load_from_file()
session.delete_data_on_quit = False
@defer.inlineCallbacks
def close_listeners(ignored):
for port in ports:
yield port.stopListening()
yield session.start()
d = deferToThread(cons.cmdloop)
d.addCallback(close_listeners)
d.addCallback(lambda ignored: reactor.stop())
if delete_data_on_quit:
d.addCallback(lambda ignored: delete_datafile())
session.complete_defer.addCallback(lambda ignored: reactor.stop())
def start():
reactor.callWhenRunning(main)
reactor.run()
def inturrupt_handler(signal, frame):
global session
global quit_confirm_time
if not quit_confirm_time or datetime.datetime.now() > quit_confirm_time:
print ''
print ('Interrupting will cause Pappy to quit completely. This will '
'cause any in-memory only requests to be lost, but all other '
'data will be saved.')
print ('Interrupt a second time to confirm.')
print ''
quit_confirm_time = datetime.datetime.now() + datetime.timedelta(0, 10)
else:
d = session.cleanup()
d.addCallback(lambda _: reactor.stop())
d.addCallback(lambda _: os._exit(1)) # Sorry blocking threads :(
if __name__ == '__main__':
start()
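The SIGINT handler above implements a two-step quit: the first Ctrl-C arms a ten-second confirmation window, and only a second interrupt inside that window shuts Pappy down. The window logic can be isolated like this (`should_quit` is an illustrative name, not part of Pappy):

```python
import datetime

quit_confirm_time = None  # module-level state, as in pappy.py

def should_quit(now):
    """First interrupt arms a 10s window; a second inside it confirms."""
    global quit_confirm_time
    if quit_confirm_time is None or now > quit_confirm_time:
        quit_confirm_time = now + datetime.timedelta(seconds=10)
        return False  # warn only, don't quit yet
    return True

t0 = datetime.datetime.now()
first = should_quit(t0)                                   # warn only
second = should_quit(t0 + datetime.timedelta(seconds=2))  # confirms
```

An interrupt arriving after the window expires simply re-arms it rather than quitting.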

@@ -16,8 +16,6 @@ from .proxy import remove_intercepting_macro as proxy_remove_intercepting_macro
from .colors import Colors
from .util import PappyException
from twisted.internet import defer
class Plugin(object):
def __init__(self, cmd, fname=None):
@@ -94,7 +92,7 @@ def add_intercepting_macro(name, macro):
only use this if you may need to modify messages before they are
passed along.
"""
for factory in pappyproxy.pappy.server_factories:
for factory in pappyproxy.pappy.session.server_factories:
proxy_add_intercepting_macro(name, macro, factory.intercepting_macros)
def remove_intercepting_macro(name):
@@ -104,7 +102,7 @@ def remove_intercepting_macro(name):
:func:`pappyproxy.plugin.add_intercepting_macro` to identify which
macro you would like to stop.
"""
for factory in pappyproxy.pappy.server_factories:
for factory in pappyproxy.pappy.session.server_factories:
proxy_remove_intercepting_macro(name, factory.intercepting_macros)
def active_intercepting_macros():
@@ -113,7 +111,7 @@ def active_intercepting_macros():
this list will not affect which macros are active.
"""
ret = []
for factory in pappyproxy.pappy.server_factories:
for factory in pappyproxy.pappy.session.server_factories:
ret += [v for k, v in factory.intercepting_macros.iteritems() ]
return ret
@@ -136,15 +134,14 @@ def req_history(num=-1, ids=None, include_unmangled=False):
``include_unmangled`` is True, then the iterator will include
requests which are the unmangled version of other requests.
An example of using the iterator to print the 10 most recent requests:
```
@defer.inlineCallbacks
def find_food():
for req_d in req_history(10):
req = yield req_d
print '-'*10
print req.full_message_pretty
```
An example of using the iterator to print the 10 most recent requests::
@defer.inlineCallbacks
def find_food():
for req_d in req_history(10):
req = yield req_d
print '-'*10
print req.full_message_pretty
"""
return pappyproxy.Request.cache.req_it(num=num, ids=ids, include_unmangled=include_unmangled)

@@ -9,8 +9,8 @@ import datetime
from pappyproxy.http import Request, post_request
from pappyproxy.util import PappyException
from pappyproxy.requestcache import RequestCache
from pappyproxy.console import print_requests
from pappyproxy.pappy import heapstats, cons
from pappyproxy.util import print_requests
from pappyproxy.pappy import heapstats, session
from pappyproxy.plugin import require_modules
from twisted.internet import defer
@@ -97,7 +97,7 @@ def big_fucking_data_file(line):
def time_cmd(line):
print 'Timing `%s`...' % line
start = datetime.datetime.now()
cons.onecmd(line.strip())
session.cons.onecmd(line.strip())
end = datetime.datetime.now()
total_time = (end-start).total_seconds()
print '`{0}` took {1:.3f} seconds'.format(line, total_time)
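The timing wrapper above is plain wall-clock arithmetic with `datetime`; the same pattern works for any callable (the `time.sleep` below stands in for `session.cons.onecmd(line)`):

```python
import datetime
import time

# Wall-clock timing as used by the `time` debug command.
start = datetime.datetime.now()
time.sleep(0.05)  # stand-in for the command being timed
end = datetime.datetime.now()
total_time = (end - start).total_seconds()
print('took {0:.3f} seconds'.format(total_time))
```
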

@@ -1,8 +1,7 @@
import crochet
import pappyproxy
from pappyproxy.console import confirm
from pappyproxy.util import PappyException
from pappyproxy.util import PappyException, confirm
from pappyproxy.http import Request
from twisted.internet import defer

@@ -3,9 +3,8 @@ import pappyproxy
import shlex
from pappyproxy.plugin import active_intercepting_macros, add_intercepting_macro, remove_intercepting_macro
from pappyproxy.console import load_reqlist
from pappyproxy.macros import load_macros, macro_from_requests, gen_imacro
from pappyproxy.util import PappyException
from pappyproxy.util import PappyException, load_reqlist
from twisted.internet import defer
loaded_macros = []

@@ -10,7 +10,7 @@ from pappyproxy.util import PappyException
from pappyproxy.macros import InterceptMacro
from pappyproxy.http import Request, Response
from pappyproxy.plugin import add_intercepting_macro, remove_intercepting_macro
from pappyproxy import comm, config
from pappyproxy import pappy
from twisted.internet import defer
PLUGIN_ID="manglecmds"
@@ -126,8 +126,8 @@ def check_reqid(reqid):
defer.returnValue(None)
def start_editor(reqid):
script_loc = os.path.join(config.PAPPY_DIR, "plugins", "vim_repeater", "repeater.vim")
subprocess.call(["vim", "-S", script_loc, "-c", "RepeaterSetup %s %d"%(reqid, comm.comm_port)])
script_loc = os.path.join(pappy.session.config.pappy_dir, "plugins", "vim_repeater", "repeater.vim")
subprocess.call(["vim", "-S", script_loc, "-c", "RepeaterSetup %s %d"%(reqid, pappy.session.comm_port)])
####################
## Command functions
@@ -163,6 +163,8 @@ def intercept(line):
intercept_requests = True
if any(a in rsp_names for a in args):
intercept_responses = True
if not args:
intercept_requests = True
if intercept_requests and intercept_responses:
intercept_str = 'Requests and responses'

@@ -3,11 +3,10 @@ import pappyproxy
import shlex
from pappyproxy.colors import Colors, Styles, path_formatter, host_color, scode_color, verb_color
from pappyproxy.console import confirm, load_reqlist, Capturing
from pappyproxy.util import PappyException, remove_color
from pappyproxy.util import PappyException, remove_color, confirm, load_reqlist, Capturing
from pappyproxy.macros import InterceptMacro
from pappyproxy.requestcache import RequestCache
from pappyproxy.pappy import cons
from pappyproxy.pappy import session
from pappyproxy.plugin import add_intercepting_macro, remove_intercepting_macro
from twisted.internet import defer
from twisted.enterprise import adbapi
@@ -76,7 +75,7 @@ def gencerts(line):
Generate CA cert and private CA file
Usage: gencerts [/path/to/put/certs/in]
"""
dest_dir = line or pappyproxy.config.CERT_DIR
dest_dir = line or pappyproxy.pappy.session.config.cert_dir
message = "This will overwrite any existing certs in %s. Are you sure?" % dest_dir
if not confirm(message, 'n'):
return False
@@ -94,9 +93,9 @@ def log(line):
verbosity = int(line.strip())
except:
verbosity = 1
pappyproxy.config.DEBUG_VERBOSITY = verbosity
pappyproxy.pappy.session.config.debug_verbosity = verbosity
raw_input()
pappyproxy.config.DEBUG_VERBOSITY = 0
pappyproxy.pappy.session.config.debug_verbosity = 0
@crochet.wait_for(timeout=None)
@defer.inlineCallbacks
@@ -182,7 +181,7 @@ def watch_proxy(line):
def run_without_color(line):
with Capturing() as output:
cons.onecmd(line.strip())
session.cons.onecmd(line.strip())
print remove_color(output.val)
def load_cmds(cmd):

@@ -3,8 +3,7 @@ import pappyproxy
import shlex
from pappyproxy.plugin import main_context_ids
from pappyproxy.console import load_reqlist
from pappyproxy.util import PappyException
from pappyproxy.util import PappyException, load_reqlist
from twisted.internet import defer
from pappyproxy.http import Request

@@ -7,8 +7,7 @@ import pprint
import shlex
import urllib
from pappyproxy.console import load_reqlist, print_table, print_request_rows, get_req_data_row
from pappyproxy.util import PappyException, utc2local
from pappyproxy.util import PappyException, utc2local, load_reqlist, print_table, print_request_rows, get_req_data_row
from pappyproxy.http import Request, repeatable_parse_qs
from twisted.internet import defer
from pappyproxy.plugin import main_context_ids
@@ -270,6 +269,8 @@ def view_request_info(line):
Usage: view_request_info <reqid(s)>
"""
args = shlex.split(line)
if not args:
raise PappyException("Request id is required")
reqids = args[0]
reqs = yield load_reqlist(reqids)
@@ -287,6 +288,8 @@ def view_request_headers(line):
Usage: view_request_headers <reqid(s)>
"""
args = shlex.split(line)
if not args:
raise PappyException("Request id is required")
reqid = args[0]
reqs = yield load_reqlist(reqid)
@@ -307,6 +310,8 @@ def view_full_request(line):
Usage: view_full_request <reqid(s)>
"""
args = shlex.split(line)
if not args:
raise PappyException("Request id is required")
reqid = args[0]
reqs = yield load_reqlist(reqid)
@@ -326,6 +331,8 @@ def view_request_bytes(line):
Usage: view_request_bytes <reqid(s)>
"""
args = shlex.split(line)
if not args:
raise PappyException("Request id is required")
reqid = args[0]
reqs = yield load_reqlist(reqid)

@@ -6,7 +6,6 @@ import random
from OpenSSL import SSL
from OpenSSL import crypto
from pappyproxy import config
from pappyproxy import context
from pappyproxy import http
from pappyproxy import macros
@@ -37,35 +36,37 @@ def remove_intercepting_macro(key, int_macro_dict):
del int_macro_dict[key]
def log(message, id=None, symbol='*', verbosity_level=1):
if config.DEBUG_TO_FILE or config.DEBUG_VERBOSITY > 0:
if config.DEBUG_TO_FILE and not os.path.exists(config.DEBUG_DIR):
os.makedirs(config.DEBUG_DIR)
from pappyproxy.pappy import session
if session.config.debug_to_file or session.config.debug_verbosity > 0:
if session.config.debug_to_file and not os.path.exists(session.config.debug_dir):
os.makedirs(session.config.debug_dir)
if id:
debug_str = '[%s](%d) %s' % (symbol, id, message)
if config.DEBUG_TO_FILE:
with open(config.DEBUG_DIR+'/connection_%d.log' % id, 'a') as f:
if session.config.debug_to_file:
with open(session.config.debug_dir+'/connection_%d.log' % id, 'a') as f:
f.write(debug_str+'\n')
else:
debug_str = '[%s] %s' % (symbol, message)
if config.DEBUG_TO_FILE:
with open(config.DEBUG_DIR+'/debug.log', 'a') as f:
if session.config.debug_to_file:
with open(session.config.debug_dir+'/debug.log', 'a') as f:
f.write(debug_str+'\n')
if config.DEBUG_VERBOSITY >= verbosity_level:
if session.config.debug_verbosity >= verbosity_level:
print debug_str
def log_request(request, id=None, symbol='*', verbosity_level=3):
if config.DEBUG_TO_FILE or config.DEBUG_VERBOSITY > 0:
from pappyproxy.pappy import session
if session.config.debug_to_file or session.config.debug_verbosity > 0:
r_split = request.split('\r\n')
for l in r_split:
log(l, id, symbol, verbosity_level)
def get_endpoint(target_host, target_port, target_ssl, socks_config=None):
# Imports go here to allow mocking for tests
from twisted.internet.endpoints import SSL4ClientEndpoint, TCP4ClientEndpoint
from txsocksx.client import SOCKS5ClientEndpoint
from txsocksx.tls import TLSWrapClientEndpoint
from twisted.internet.interfaces import IOpenSSLClientConnectionCreator
if socks_config is not None:
sock_host = socks_config['host']
@@ -102,6 +103,7 @@ class ProxyClient(LineReceiver):
self.request = request
self.data_defer = defer.Deferred()
self.completed = False
self.stream_response = True # used so child classes can temporarily turn off response streaming
self._response_obj = http.Response()
@@ -112,17 +114,19 @@ class ProxyClient(LineReceiver):
line = args[0]
if line is None:
line = ''
self._response_obj.add_line(line)
self.log(line, symbol='r<', verbosity_level=3)
self._response_obj.add_line(line)
if self._response_obj.headers_complete:
self.setRawMode()
def rawDataReceived(self, *args, **kwargs):
from pappyproxy.pappy import session
data = args[0]
self.log('Returning data back through stream')
if not self._response_obj.complete:
if data:
if config.DEBUG_TO_FILE or config.DEBUG_VERBOSITY > 0:
if session.config.debug_to_file or session.config.debug_verbosity > 0:
s = printable_data(data)
dlines = s.split('\n')
for l in dlines:
@@ -130,7 +134,7 @@ class ProxyClient(LineReceiver):
self._response_obj.add_data(data)
def dataReceived(self, data):
if self.factory.stream_response:
if self.factory.stream_response and self.stream_response:
self.factory.return_transport.write(data)
LineReceiver.dataReceived(self, data)
if not self.completed:
@@ -159,6 +163,68 @@ class ProxyClient(LineReceiver):
def clientConnectionLost(self, connector, reason):
self.log("Connection with remote server lost: %s" % reason)
class UpstreamHTTPProxyClient(ProxyClient):
def __init__(self, request):
ProxyClient.__init__(self, request)
self.connect_response = False
self.proxy_connected = False
self.stream_response = False
self.creds = None
def write_proxied_request(self, request):
"""
Takes an unencrypted request and sends it to the proxy server to be
forwarded.
"""
sendreq = request.copy()
sendreq.path_type = http.PATH_ABSOLUTE
if self.creds is not None:
sendreq.proxy_creds = self.creds
lines = sendreq.full_request.splitlines()
for l in lines:
self.log(l, symbol='>r', verbosity_level=3)
self.transport.write(sendreq.full_message)
def connectionMade(self):
self.log("Connection made to http proxy", verbosity_level=3)
if not self.proxy_connected:
if self.request.is_ssl:
connreq = self.request.connect_request
self.connect_response = True
if self.creds is not None:
connreq.proxy_creds = self.creds
self.transport.write(connreq.full_message)
else:
self.proxy_connected = True
self.stream_response = True
self.write_proxied_request(self.request)
def handle_response_end(self, *args, **kwargs):
if self._response_obj.response_code == 407:
print "Incorrect credentials for HTTP proxy. Please check your username and password."
self.transport.loseConnection()
return
if self.proxy_connected:
self.log("Received response while connected to http proxy", verbosity_level=3)
self.request.response = self._response_obj
self.transport.loseConnection()
assert self._response_obj.full_response
self.data_defer.callback(self.request)
elif self.connect_response:
self.log("Response to CONNECT request received from http proxy", verbosity_level=3)
self.proxy_connected = True
self.stream_response = True
self._response_obj = http.Response()
self.setLineMode()
self.completed = False
self._sent = False
self.transport.startTLS(ClientTLSContext())
lines = self.request.full_message.splitlines()
for l in lines:
self.log(l, symbol='>r', verbosity_level=3)
self.transport.write(self.request.full_message)
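For reference, the CONNECT preamble an upstream HTTP proxy expects before TLS can start looks like the sketch below. Pappy builds it via `request.connect_request` rather than by hand; the host and port here are placeholders:

```python
# Sketch of the CONNECT request sent to an upstream HTTP proxy before
# switching the connection to TLS with startTLS().
host, port = 'target.example', 443  # placeholder target
connect_req = ('CONNECT %s:%d HTTP/1.1\r\n'
               'Host: %s:%d\r\n'
               '\r\n') % (host, port, host, port)
```

Once the proxy answers 2xx, the client restarts its response parser and begins TLS over the now-open tunnel, which is exactly what `handle_response_end` does above.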
class ProxyClientFactory(ClientFactory):
@@ -173,13 +239,22 @@ class ProxyClientFactory(ClientFactory):
self.stream_response = stream_response
self.return_transport = return_transport
self.intercepting_macros = {}
self.use_as_proxy = False
def log(self, message, symbol='*', verbosity_level=1):
log(message, id=self.connection_id, symbol=symbol, verbosity_level=verbosity_level)
def buildProtocol(self, addr, _do_callback=True):
from pappyproxy.pappy import session
# _do_callback is intended to help with testing and should not be modified
p = ProxyClient(self.request)
if self.use_as_proxy and context.in_scope(self.request):
p = UpstreamHTTPProxyClient(self.request)
if 'username' in session.config.http_proxy and 'password' in session.config.http_proxy:
username = session.config.http_proxy['username']
password = session.config.http_proxy['password']
p.creds = (username, password)
else:
p = ProxyClient(self.request)
p.factory = self
self.log("Building protocol", verbosity_level=3)
if _do_callback:
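The `(username, password)` tuple stored in `p.creds` is ultimately serialized as a basic `Proxy-Authorization` header on the outgoing request. The encoding itself is just base64 of `user:pass`; a sketch (not Pappy's own serializer, and the credentials are placeholders):

```python
import base64

# Basic proxy-auth encoding: "user:pass" base64'd into a header value.
username, password = 'user', 'pass'  # placeholder credentials
token = base64.b64encode(('%s:%s' % (username, password)).encode('ascii'))
header = 'Proxy-Authorization: Basic ' + token.decode('ascii')
```
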
@@ -198,8 +273,10 @@ class ProxyClientFactory(ClientFactory):
Prepares request for submitting
Saves the associated request with a temporary start time, mangles it, then
saves the mangled version with an update start time.
saves the mangled version with an updated start time. Also updates flags
and values needed for submitting the request.
"""
from pappyproxy.pappy import session
sendreq = self.request
if context.in_scope(sendreq):
@@ -217,6 +294,9 @@ class ProxyClientFactory(ClientFactory):
self.start_time = datetime.datetime.utcnow()
sendreq.time_start = self.start_time
yield sendreq.async_deep_save()
if session.config.http_proxy:
self.use_as_proxy = True
else:
self.log("Request out of scope, passing along unmangled")
self.request = sendreq
@@ -227,11 +307,13 @@ class ProxyClientFactory(ClientFactory):
"""
If the request is in scope, it saves the completed request,
sets the start/end time, mangles the response, saves the
mangled version, then writes the response back through the
transport.
mangled version, then calls back data_defer with the mangled
request.
"""
from pappyproxy.pappy import session
self.end_time = datetime.datetime.utcnow()
if config.DEBUG_TO_FILE or config.DEBUG_VERBOSITY > 0:
if session.config.debug_to_file or session.config.debug_verbosity > 0:
log_request(printable_data(request.response.full_response), id=self.connection_id, symbol='<m', verbosity_level=3)
request.time_start = self.start_time
@@ -250,7 +332,7 @@ class ProxyClientFactory(ClientFactory):
if mangled and self.save_all:
yield request.async_deep_save()
if request.response and (config.DEBUG_TO_FILE or config.DEBUG_VERBOSITY > 0):
if request.response and (session.config.debug_to_file or session.config.debug_verbosity > 0):
log_request(printable_data(request.response.full_response),
id=self.connection_id, symbol='<', verbosity_level=3)
else:
@@ -261,9 +343,12 @@ class ProxyServerFactory(ServerFactory):
class ProxyServerFactory(ServerFactory):
def __init__(self, save_all=False):
from pappyproxy.site import PappyWebServer
self.intercepting_macros = collections.OrderedDict()
self.save_all = save_all
self.force_ssl = False
self.web_server = PappyWebServer()
self.forward_host = None
def buildProtocol(self, addr):
@@ -308,12 +393,11 @@ class ProxyServer(LineReceiver):
LineReceiver.dataReceived(self, *args, **kwargs)
if self._request_obj.complete:
try:
self.full_request_received()
except PappyException as e:
print str(e)
self.full_request_received()
def _start_tls(self, cert_host=None):
from pappyproxy.pappy import session
# Generate a cert for the hostname and start tls
if cert_host is None:
host = self._request_obj.host
@@ -323,7 +407,7 @@ class ProxyServer(LineReceiver):
log("Generating cert for '%s'" % host,
verbosity_level=3)
(pkey, cert) = generate_cert(host,
config.CERT_DIR)
session.config.cert_dir)
cached_certs[host] = (pkey, cert)
else:
log("Using cached cert for %s" % host, verbosity_level=3)
@@ -339,6 +423,7 @@ class ProxyServer(LineReceiver):
okay_str = 'HTTP/1.1 200 Connection established\r\n\r\n'
self.transport.write(okay_str)
@defer.inlineCallbacks
def full_request_received(self):
global cached_certs
@@ -355,9 +440,10 @@ class ProxyServer(LineReceiver):
self.log('uri=%s, ssl=%s, connect_port=%s' % (self._connect_uri, self._connect_ssl, self._connect_port), verbosity_level=3)
forward = False
# if self._request_obj.host == 'pappy':
# self._create_pappy_response()
# forward = False
if self._request_obj.host == 'pappy':
yield self.factory.web_server.handle_request(self._request_obj)
self.transport.write(self._request_obj.response.full_message)
forward = False
# if _request_obj.host is a listener, forward = False
@@ -411,6 +497,8 @@ class ProxyServer(LineReceiver):
Creates an endpoint to the target server using the given configuration
options then connects to the endpoint using self._client_factory
"""
from pappyproxy.pappy import session
self._request_obj = req
# If we have a socks proxy, wrap the endpoint in it
@@ -421,11 +509,18 @@ class ProxyServer(LineReceiver):
if self.factory.forward_host:
self._request_obj.host = self.factory.forward_host
usehost = self._request_obj.host
useport = self._request_obj.port
usessl = self._request_obj.is_ssl
if session.config.http_proxy:
usehost = session.config.http_proxy['host']
useport = session.config.http_proxy['port']
usessl = False # We turn on ssl after CONNECT request if needed
self.log("Connecting to http proxy at %s:%d" % (usehost, useport))
# Get connection from the request
endpoint = get_endpoint(self._request_obj.host,
self._request_obj.port,
self._request_obj.is_ssl,
socks_config=config.SOCKS_PROXY)
endpoint = get_endpoint(usehost, useport, usessl,
socks_config=session.config.socks_proxy)
else:
endpoint = get_endpoint(self._request_obj.host,
self._request_obj.port,
@@ -483,14 +578,15 @@ def generate_cert_serial():
return random.getrandbits(8*20)
def load_certs_from_dir(cert_dir):
from pappyproxy.pappy import session
try:
with open(cert_dir+'/'+config.SSL_CA_FILE, 'rt') as f:
with open(cert_dir+'/'+session.config.ssl_ca_file, 'rt') as f:
ca_raw = f.read()
except IOError:
raise PappyException("Could not load CA cert! Generate certs using the `gencerts` command then add the .crt file to your browser.")
try:
with open(cert_dir+'/'+config.SSL_PKEY_FILE, 'rt') as f:
with open(cert_dir+'/'+session.config.ssl_pkey_file, 'rt') as f:
ca_key_raw = f.read()
except IOError:
raise PappyException("Could not load CA private key!")
@@ -519,6 +615,8 @@ def generate_cert(hostname, cert_dir):
def generate_ca_certs(cert_dir):
from pappyproxy.pappy import session
# Make directory if necessary
if not os.path.exists(cert_dir):
os.makedirs(cert_dir)
@@ -527,7 +625,7 @@ def generate_ca_certs(cert_dir):
print "Generating private key... ",
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
with os.fdopen(os.open(cert_dir+'/'+config.SSL_PKEY_FILE, os.O_WRONLY | os.O_CREAT, 0o0600), 'w') as f:
with os.fdopen(os.open(cert_dir+'/'+session.config.ssl_pkey_file, os.O_WRONLY | os.O_CREAT, 0o0600), 'w') as f:
f.write(crypto.dump_privatekey(crypto.FILETYPE_PEM, key))
print "Done!"
@@ -555,7 +653,7 @@ def generate_ca_certs(cert_dir):
])
cert.set_pubkey(key)
cert.sign(key, 'sha256')
with os.fdopen(os.open(cert_dir+'/'+config.SSL_CA_FILE, os.O_WRONLY | os.O_CREAT, 0o0600), 'w') as f:
with os.fdopen(os.open(cert_dir+'/'+session.config.ssl_ca_file, os.O_WRONLY | os.O_CREAT, 0o0600), 'w') as f:
f.write(crypto.dump_certificate(crypto.FILETYPE_PEM, cert))
print "Done!"
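Both the private key and the CA cert above are written through `os.open` with mode `0o600`, so the files are created readable only by their owner. The pattern in isolation (path and contents are placeholders):

```python
import os
import stat
import tempfile

# Owner-only file creation, as used for the CA key and cert files.
path = os.path.join(tempfile.mkdtemp(), 'ca.pkey')
with os.fdopen(os.open(path, os.O_WRONLY | os.O_CREAT, 0o600), 'w') as f:
    f.write('-----BEGIN PRIVATE KEY-----\n')  # placeholder contents
perms = stat.S_IMODE(os.stat(path).st_mode)   # no group/other access
```

Using `os.open` instead of plain `open` matters because the restrictive mode is applied atomically at creation; there is no window where the key exists world-readable.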

@@ -198,7 +198,6 @@ class RequestCache(object):
"""
# Get the request
victim_id = self._min_time[0]
req = self._cached_reqs[victim_id]
self.evict(victim_id)
def _update_min(self, updated_reqid=None):

@@ -0,0 +1,179 @@
import os
import mimetypes
from .http import Request, Response
from .util import PappyStringTransport, PappyException
from twisted.test.proto_helpers import StringTransport
from twisted.web.server import Site, NOT_DONE_YET
from twisted.web import static
from twisted.web.resource import Resource, NoResource
from jinja2 import Environment, FileSystemLoader
from twisted.internet import defer
## The web server class
class PappyWebServer(object):
"""
A class that is used to serve pages for requests to http://pappy. It is a
rough wrapper around a twisted web Site object. Give it a request object
and it will add a response to it.
NOINDEX
"""
from pappyproxy.pappy import session
site_dir = session.config.pappy_dir+'/site'
loader = FileSystemLoader(site_dir)
env = Environment(loader=loader)
def __init__(self):
root = RootResource(self.site_dir)
self.site = Site(root)
@staticmethod
def render_template(*args, **kwargs):
return PappyWebServer.env.get_template(args[0]).render(*args[1:], **kwargs).encode('utf-8')
@defer.inlineCallbacks
def handle_request(self, req):
protocol = self.site.buildProtocol(None)
tr = PappyStringTransport()
protocol.makeConnection(tr)
protocol.dataReceived(req.full_request)
tr.waitForProducers()
## WORKING HERE
# use loading functions to load response
yield tr.complete_deferred
rsp_raw = tr.value()
rsp = Response(rsp_raw)
req.response = rsp
## functions
def blocking_string_request(func):
"""
Wrapper for blocking request handlers in resources. The custom string
transport has a deferred that must be called back when the message is
complete. If the handler blocks (i.e. completes synchronously), it can simply be called back right away.
NOINDEX
"""
def f(self, request):
request.transport.complete_deferred.callback(None)
return func(self, request)
return f
## Resources
class PappyResource(Resource):
"""
Helper class for site resources.
NOINDEX
"""
def getChild(self, name, request):
if name == '':
return self
return Resource.getChild(self, name, request)
class RootResource(PappyResource):
def __init__(self, site_dir):
PappyResource.__init__(self)
self.site_dir = site_dir
self.dirListing = False
# Static resource
self.static_resource = NoDirFile(self.site_dir + '/static')
self.putChild('static', self.static_resource)
# Cert download resource
self.putChild('certs', CertResource())
# Response viewing resource
self.putChild('rsp', ResponseResource())
@blocking_string_request
def render_GET(self, request):
return PappyWebServer.render_template('index.html')
class NoDirFile(static.File):
def directoryListing(self):
return NoResource()
@blocking_string_request
def render_GET(self, request):
return static.File.render_GET(self, request)
## Cert resources
class CertResource(PappyResource):
def __init__(self):
PappyResource.__init__(self)
self.putChild('download', CertDownload())
@blocking_string_request
def render_GET(self, request):
return PappyWebServer.render_template('certs.html')
class CertDownload(PappyResource):
@blocking_string_request
def render_GET(self, request):
from .pappy import session
cert_dir = session.config.cert_dir
ssl_ca_file = session.config.ssl_ca_file
with open(os.path.join(cert_dir, ssl_ca_file), 'r') as f:
ca_raw = f.read()
request.responseHeaders.addRawHeader("Content-Type", "application/x-x509-ca-cert")
return ca_raw
## View responses
class ResponseResource(PappyResource):
def getChild(self, name, request):
if name == '':
return self
return ViewResponseResource(name)
@blocking_string_request
def render_GET(self, request):
return PappyWebServer.render_template('viewrsp.html')
class ViewResponseResource(PappyResource):
def __init__(self, reqid):
PappyResource.__init__(self)
self.reqid = reqid
def render_GET(self, request):
d = Request.load_request(self.reqid)
d.addCallback(self._render_response, request)
d.addErrback(self._render_response_err, request)
d.addCallback(lambda _: request.transport.complete_deferred.callback(None))
return NOT_DONE_YET
def _render_response(self, req, tw_request):
if req.response:
if not req.response.body:
raise PappyException("Response has no body")
if 'content-type' in req.response.headers:
tw_request.responseHeaders.addRawHeader("Content-Type", req.response.headers['content-type'])
else:
guess = mimetypes.guess_type(req.url)
if guess[0]:
tw_request.responseHeaders.addRawHeader("Content-Type", guess[0])
tw_request.write(req.response.body)
else:
tw_request.write(PappyWebServer.render_template('norsp.html'))
tw_request.finish()
def _render_response_err(self, err, tw_request):
tw_request.write(PappyWebServer.render_template('norsp.html', errmsg=err.getErrorMessage()))
tw_request.finish()
err.trap(Exception)
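When a stored response lacks a `Content-Type` header, `ViewResponseResource` falls back to guessing from the request URL. `mimetypes.guess_type` accepts full URLs directly (the URL below is a placeholder):

```python
import mimetypes

# Content-Type fallback used by the response viewer: guess from the URL
# extension when the stored response has no content-type header.
guess = mimetypes.guess_type('http://www.example.faketld/img/logo.png')
content_type = guess[0]  # guess[1] is the encoding, None here
```
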

@@ -0,0 +1,11 @@
<html>
<head>
<title>Pappy</title>
</head>
<body style="background-color: #414141">
<div style="padding: 12pt; width:960px; margin:auto; background-color: #AAA">
<h1>Pappy</h1>
{% block body %}{% endblock %}
</div>
</body>
</html>

@@ -0,0 +1,6 @@
{% extends "base.html" %}
{% block body %}
<h2>Cert Download</h2>
Click <a href="/certs/download">here</a> to download the CA cert.
{% endblock %}

@@ -0,0 +1,8 @@
{% extends "base.html" %}
{% block body %}
<ul>
<li><a href="/certs">Certs</a></li>
<li>View responses in your browser at <a href="http://pappy/rsp">http://pappy/rsp/&lt;reqid&gt;</a></li>
</ul>
{% endblock %}

@@ -0,0 +1,8 @@
{% extends "base.html" %}
{% block body %}
<h2>Unable To Return Response Body</h2>
{% if errmsg %}
<p>{{ errmsg }}</p>
{% endif %}
{% endblock %}

@@ -0,0 +1,6 @@
{% extends "base.html" %}
{% block body %}
<h2>View Response</h2>
<p>Visit http://pappy/rsp/&lt;id&gt; to view a response in your browser. The body of the response returned to your browser will be the same, but the headers will not.</p>
{% endblock %}

@@ -0,0 +1,112 @@
import base64
import pytest
import mock
import json
import datetime
import pappyproxy
from pappyproxy.util import PappyException
from pappyproxy.comm import CommServer
from pappyproxy.http import Request, Response
from testutil import mock_deferred, func_deleted, TLSStringTransport, freeze, mock_int_macro, no_tcp
@pytest.fixture
def http_request():
req = Request('GET / HTTP/1.1\r\n\r\n')
req.host = 'www.foo.faketld'
req.port = '1337'
req.is_ssl = True
req.reqid = 123
rsp = Response('HTTP/1.1 200 OK\r\n\r\n')
req.response = rsp
return req
def perform_comm(line):
serv = CommServer()
serv.transport = TLSStringTransport()
serv.lineReceived(line)
n = datetime.datetime.now()
while serv.transport.value() == '':
t = datetime.datetime.now()
if (t-n).total_seconds() > 5:
raise Exception("Request timed out")
return serv.transport.value()
def test_simple():
v = perform_comm('{"action": "ping"}')
assert json.loads(v) == {'ping': 'pong', 'success': True}
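The test above exercises the comm server's newline-delimited JSON protocol directly. As an illustration only (the helper names below are hypothetical, and the `\r\n` framing is an assumption based on Twisted's default `LineReceiver` delimiter), a client-side round trip might look like:

```python
import json

def encode_comm_message(action, **fields):
    # One JSON object per line; assumes the LineReceiver-style server
    # expects CRLF-terminated lines (Twisted's default delimiter)
    msg = {'action': action}
    msg.update(fields)
    return json.dumps(msg) + '\r\n'

def decode_comm_response(line):
    # Every response carries a 'success' flag; surface the error message
    # on failure rather than returning a half-valid dict
    rsp = json.loads(line)
    if not rsp.get('success', False):
        raise ValueError(rsp.get('message', 'unknown comm error'))
    return rsp
```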
def mock_loader(rsp):
def f(*args, **kwargs):
return rsp
return classmethod(f)
def mock_loader_fail():
def f(*args, **kwargs):
raise PappyException("lololo message don't exist dawg")
return classmethod(f)
def test_get_request(mocker, http_request):
mocker.patch.object(pappyproxy.http.Request, 'load_request', new=mock_loader(http_request))
v = perform_comm('{"action": "get_request", "reqid": "1"}')
expected_data = json.loads(http_request.to_json())
expected_data['success'] = True
assert json.loads(v) == expected_data
def test_get_request_fail(mocker, http_request):
mocker.patch.object(pappyproxy.http.Request, 'load_request', new=mock_loader_fail())
v = json.loads(perform_comm('{"action": "get_request", "reqid": "1"}'))
assert v['success'] == False
assert 'message' in v
def test_get_response(mocker, http_request):
mocker.patch.object(pappyproxy.http.Request, 'load_request', new=mock_loader(http_request))
mocker.patch.object(pappyproxy.http.Response, 'load_response', new=mock_loader(http_request.response))
v = perform_comm('{"action": "get_response", "reqid": "1"}')
expected_data = json.loads(http_request.response.to_json())
expected_data['success'] = True
assert json.loads(v) == expected_data
def test_get_response_fail(mocker, http_request):
mocker.patch.object(pappyproxy.http.Request, 'load_request', new=mock_loader(http_request))
mocker.patch.object(pappyproxy.http.Response, 'load_response', new=mock_loader_fail())
v = json.loads(perform_comm('{"action": "get_response", "reqid": "1"}'))
assert v['success'] == False
assert 'message' in v
def test_submit_request(mocker, http_request):
mocker.patch.object(pappyproxy.http.Request, 'submit_new', new=mock_loader(http_request))
mocker.patch('pappyproxy.http.Request.async_deep_save').return_value = mock_deferred()
comm_data = {"action": "submit"}
comm_data['host'] = http_request.host
comm_data['port'] = http_request.port
comm_data['is_ssl'] = http_request.is_ssl
comm_data['full_message'] = base64.b64encode(http_request.full_message)
comm_data['tags'] = ['footag']
v = perform_comm(json.dumps(comm_data))
expected_data = {}
expected_data['request'] = json.loads(http_request.to_json())
expected_data['response'] = json.loads(http_request.response.to_json())
expected_data['success'] = True
expected_data['request']['tags'] = ['footag']
assert json.loads(v) == expected_data
def test_submit_request_fail(mocker, http_request):
mocker.patch.object(pappyproxy.http.Request, 'submit_new', new=mock_loader_fail())
mocker.patch('pappyproxy.http.Request.async_deep_save').return_value = mock_deferred()
comm_data = {"action": "submit"}
comm_data['full_message'] = base64.b64encode('HELLO THIS IS REQUEST\r\nWHAT IS HEADER FORMAT\r\n')
v = json.loads(perform_comm(json.dumps(comm_data)))
print v
assert v['success'] == False
assert 'message' in v

@ -885,6 +885,42 @@ def test_request_modify_header2():
'\r\n'
'foo=barr')
def test_request_absolute_url():
r = http.Request(('GET /foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
assert r.full_message == ('GET /foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
r.path_type = http.PATH_ABSOLUTE
assert r.full_message == ('GET http://www.example.faketld/foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
r.is_ssl = True
assert r.full_message == ('GET https://www.example.faketld/foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
def test_proxy_auth():
r = http.Request(('GET /foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.proxy_creds = ('username', 'pass:word')
assert r.full_message == ('GET /foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n'
'Proxy-Authorization: Basic dXNlcm5hbWU6cGFzczp3b3Jk\r\n\r\n')
def test_request_connect_request():
r = http.Request(('GET /foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
assert r.connect_request == None
r.is_ssl = True
assert r.connect_request.full_message == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
def test_request_connect_request_creds():
r = http.Request(('GET /foo/path HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.is_ssl = True
r.proxy_creds = ('username', 'pass:word')
assert r.connect_request.full_message == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n'
'Proxy-Authorization: Basic dXNlcm5hbWU6cGFzczp3b3Jk\r\n\r\n')
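The expected `Proxy-Authorization` values in these tests are ordinary HTTP Basic credentials: base64 of `user:password`. Because only the first colon separates the two fields, a password containing a colon (as in the fixtures above) still round-trips correctly. A minimal sketch (the function name is illustrative, not from the codebase):

```python
import base64

def proxy_auth_header(username, password):
    # Standard HTTP Basic credentials: base64("user:password"). Only the
    # first colon separates the fields, so a password containing ':' survives.
    creds = ('%s:%s' % (username, password)).encode('utf-8')
    return 'Proxy-Authorization: Basic ' + base64.b64encode(creds).decode('ascii')
```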
####################
## Response tests
@ -1301,3 +1337,10 @@ def test_response_delete_cookie():
r.delete_cookie('foo')
assert r.full_response == ('HTTP/1.1 200 OK\r\n'
'Content-Length: 0\r\n\r\n')
def test_response_short_statusline():
r = http.Response('HTTP/1.1 407\r\n\r\n')
assert r.status_line == 'HTTP/1.1 407'
assert r.response_text == ''
assert r.version == 'HTTP/1.1'
assert r.response_code == 407
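The short-status-line case above (`HTTP/1.1 407` with no reason phrase) shows up in the wild even though the spec technically calls for a trailing space before an empty reason phrase. A tolerant parser, sketched standalone (not the library's actual parsing code), splits on at most two spaces:

```python
def parse_status_line(line):
    # The reason phrase is optional in practice, so split into at most
    # three fields and default the reason to an empty string
    parts = line.split(' ', 2)
    return parts[0], int(parts[1]), parts[2] if len(parts) > 2 else ''
```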

@ -3,10 +3,12 @@ import mock
import random
import datetime
import pappyproxy
import base64
from pappyproxy import http
from pappyproxy.proxy import ProxyClientFactory, ProxyServerFactory
from pappyproxy.proxy import ProxyClientFactory, ProxyServerFactory, UpstreamHTTPProxyClient
from testutil import mock_deferred, func_deleted, TLSStringTransport, freeze, mock_int_macro, no_tcp
from twisted.internet import defer
@pytest.fixture(autouse=True)
def proxy_patches(mocker):
@ -17,8 +19,17 @@ def proxy_patches(mocker):
def server_factory():
return gen_server_factory()
@pytest.fixture(autouse=True)
def mock_config(mocker):
c = pappyproxy.config.PappyConfig()
s = pappyproxy.pappy.PappySession(c)
mocker.patch.object(pappyproxy.pappy, 'session', new=s)
def socks_config(mocker, config):
mocker.patch('pappyproxy.config.SOCKS_PROXY', new=config)
pappyproxy.pappy.session.config.socks_proxy = config
def http_proxy_config(mocker, config):
pappyproxy.pappy.session.config.http_proxy = config
def gen_server_factory(int_macros={}):
factory = ProxyServerFactory()
@ -33,16 +44,18 @@ def gen_server_protocol(int_macros={}):
protocol.makeConnection(tr)
return protocol
def gen_client_protocol(req, stream_response=False):
@defer.inlineCallbacks
def gen_client_protocol(req, stream_response=False, save_all=True):
return_transport = TLSStringTransport()
factory = ProxyClientFactory(req,
save_all=True,
save_all=save_all,
stream_response=stream_response,
return_transport=return_transport)
yield factory.prepare_request()
protocol = factory.buildProtocol(('127.0.0.1', 0), _do_callback=False)
tr = TLSStringTransport()
protocol.makeConnection(tr)
return protocol
defer.returnValue(protocol)
@pytest.fixture
def server_protocol():
@ -52,6 +65,12 @@ def mock_req_async_save(req):
req.reqid = str(random.randint(1,1000000))
return mock_deferred()
def mock_mangle_response_side_effect(new_rsp):
def f(request, mangle_macros):
request.response = new_rsp
return mock_deferred(True)
return f
####################
## Mock functions
@ -559,37 +578,284 @@ def test_proxy_client_factory_prepare_mangle_req(mocker, freeze):
### return_request_pair
# @pytest.inlineCallbacks
# def test_proxy_client_factory_prepare_mangle_rsp(mocker, freeze):
@pytest.inlineCallbacks
def test_proxy_client_factory_return_request_pair_simple(mocker, freeze):
"""
Make sure the proxy doesn't do anything if the request is out of scope
"""
freeze.freeze(datetime.datetime(2015, 1, 1, 3, 30, 15, 50))
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
mocker.patch('pappyproxy.context.in_scope').return_value = False
req = http.Request('GET / HTTP/1.1\r\n\r\n')
req.reqid = 1
rsp = http.Response('HTTP/1.1 200 OK\r\n\r\n')
checkrsp = rsp.copy()
req.response = rsp
mocker.patch('pappyproxy.macros.mangle_response').return_value = mock_deferred(False)
cf = ProxyClientFactory(req,
save_all=False,
stream_response=False,
return_transport=None)
cf.start_time = datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
cf.return_request_pair(req)
result = yield cf.data_defer
assert result == req
assert result.response == checkrsp
assert req.time_start == datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
assert req.time_end == datetime.datetime(2015, 1, 1, 3, 30, 15, 50)
assert len(rsave.mock_calls) == 0
@pytest.inlineCallbacks
def test_proxy_client_factory_return_request_pair_mangle(mocker, freeze):
"""
Make one modification to the response
"""
freeze.freeze(datetime.datetime(2015, 1, 1, 3, 30, 15, 50))
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
mocker.patch('pappyproxy.context.in_scope').return_value = True
req = http.Request('GET / HTTP/1.1\r\n\r\n')
req.reqid = 1
rsp = http.Response('HTTP/1.1 200 OK\r\n\r\n')
req.response = rsp
new_rsp = http.Response('HTTP/1.1 6969 LOLMANGLED\r\n\r\n')
checkrsp = new_rsp.copy()
mocker.patch('pappyproxy.macros.mangle_response',
side_effect=mock_mangle_response_side_effect(new_rsp))
cf = ProxyClientFactory(req,
save_all=True,
stream_response=False,
return_transport=None)
cf.start_time = datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
cf.return_request_pair(req)
result = yield cf.data_defer
assert result == req
assert result.response == checkrsp
assert req.time_start == datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
assert req.time_end == datetime.datetime(2015, 1, 1, 3, 30, 15, 50)
assert len(rsave.mock_calls) == 2
@pytest.inlineCallbacks
def test_proxy_client_factory_return_request_pair_no_save_all(mocker, freeze):
"""
Make one modification to the response but don't save it
"""
freeze.freeze(datetime.datetime(2015, 1, 1, 3, 30, 15, 50))
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
mocker.patch('pappyproxy.context.in_scope').return_value = True
req = http.Request('GET / HTTP/1.1\r\n\r\n')
req.reqid = 1
rsp = http.Response('HTTP/1.1 200 OK\r\n\r\n')
req.response = rsp
new_rsp = http.Response('HTTP/1.1 6969 LOLMANGLED\r\n\r\n')
checkrsp = new_rsp.copy()
mocker.patch('pappyproxy.macros.mangle_response',
side_effect=mock_mangle_response_side_effect(new_rsp)).return_value = mock_deferred(True)
cf = ProxyClientFactory(req,
save_all=False,
stream_response=False,
return_transport=None)
cf.start_time = datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
cf.return_request_pair(req)
result = yield cf.data_defer
assert result == req
assert result.response == checkrsp
assert req.time_start == datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
assert req.time_end == datetime.datetime(2015, 1, 1, 3, 30, 15, 50)
assert len(rsave.mock_calls) == 0
@pytest.inlineCallbacks
def test_proxy_client_factory_return_request_pair_save_all_no_mangle(mocker, freeze):
"""
Don't modify the response, but save it since save_all is set
"""
freeze.freeze(datetime.datetime(2015, 1, 1, 3, 30, 15, 50))
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
mocker.patch('pappyproxy.context.in_scope').return_value = True
req = http.Request('GET / HTTP/1.1\r\n\r\n')
req.reqid = 1
rsp = http.Response('HTTP/1.1 200 OK\r\n\r\n')
checkrsp = rsp.copy()
req.response = rsp
mocker.patch('pappyproxy.macros.mangle_response').return_value = mock_deferred(False)
cf = ProxyClientFactory(req,
save_all=True,
stream_response=False,
return_transport=None)
cf.start_time = datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
cf.return_request_pair(req)
result = yield cf.data_defer
assert result == req
assert result.response == checkrsp
assert req.time_start == datetime.datetime(2015, 1, 1, 3, 30, 14, 50)
assert req.time_end == datetime.datetime(2015, 1, 1, 3, 30, 15, 50)
assert len(rsave.mock_calls) == 1
@pytest.inlineCallbacks
def test_proxy_client_factory_build_protocol_http_proxy(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345})
r = http.Request('GET / HTTP/1.1\r\n\r\n')
cf = ProxyClientFactory(r,
save_all=False,
stream_response=False,
return_transport=None)
yield cf.prepare_request()
p = cf.buildProtocol('')
assert isinstance(p, UpstreamHTTPProxyClient)
assert p.creds is None
assert p.proxy_connected == False
@pytest.inlineCallbacks
def test_proxy_client_factory_build_protocol_http_proxy_username_only(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345, 'username': 'foo'})
r = http.Request('GET / HTTP/1.1\r\n\r\n')
cf = ProxyClientFactory(r,
save_all=False,
stream_response=False,
return_transport=None)
yield cf.prepare_request()
p = cf.buildProtocol('')
assert p.creds is None
@pytest.inlineCallbacks
def test_proxy_client_factory_build_protocol_http_proxy_username_password(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345, 'username': 'foo', 'password': 'password'})
r = http.Request('GET / HTTP/1.1\r\n\r\n')
cf = ProxyClientFactory(r,
save_all=False,
stream_response=False,
return_transport=None)
yield cf.prepare_request()
p = cf.buildProtocol('')
assert p.creds == ('foo', 'password')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_made(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345})
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
p = yield gen_client_protocol(r, save_all=False)
assert isinstance(p, UpstreamHTTPProxyClient)
assert p.transport.value() == ('GET http://www.example.faketld/ HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_made_creds(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345, 'username':'foo', 'password':'password'})
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
p = yield gen_client_protocol(r, save_all=False)
assert isinstance(p, UpstreamHTTPProxyClient)
assert p.transport.value() == ('GET http://www.example.faketld/ HTTP/1.1\r\n'
'Host: www.example.faketld\r\n'
'Proxy-Authorization: Basic %s\r\n\r\n') % base64.b64encode('foo:password')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_made_ssl(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345})
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.is_ssl = True
p = yield gen_client_protocol(r, save_all=False)
assert isinstance(p, UpstreamHTTPProxyClient)
assert p.transport.value() == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_made_ssl_creds(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345, 'username':'foo', 'password':'password'})
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.is_ssl = True
p = yield gen_client_protocol(r, save_all=False)
assert isinstance(p, UpstreamHTTPProxyClient)
assert p.transport.value() == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n'
'Proxy-Authorization: Basic %s\r\n\r\n') % base64.b64encode('foo:password')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_made_ssl_starttls(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345})
mstarttls = mocker.patch('pappyproxy.tests.testutil.TLSStringTransport.startTLS')
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.is_ssl = True
p = yield gen_client_protocol(r, save_all=False)
assert p.transport.value() == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
assert not mstarttls.called
p.transport.clear()
p.dataReceived('HTTP/1.1 200 OK\r\n\r\n')
assert mstarttls.called
assert p.transport.value() == ('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_made_ssl_starttls_creds(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345, 'username':'foo', 'password':'password'})
mstarttls = mocker.patch('pappyproxy.tests.testutil.TLSStringTransport.startTLS')
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.is_ssl = True
p = yield gen_client_protocol(r, save_all=False)
assert p.transport.value() == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n'
'Proxy-Authorization: Basic %s\r\n\r\n') % base64.b64encode('foo:password')
assert not mstarttls.called
p.transport.clear()
p.dataReceived('HTTP/1.1 200 OK\r\n\r\n')
assert mstarttls.called
assert p.transport.value() == ('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n')
@pytest.inlineCallbacks
def test_proxy_upstream_client_connection_incorrect_creds(mocker):
http_proxy_config(mocker, {'host': '12345', 'port': 12345, 'username':'foo', 'password':'password'})
mstarttls = mocker.patch('pappyproxy.tests.testutil.TLSStringTransport.startTLS')
closed = mocker.patch('pappyproxy.tests.testutil.TLSStringTransport.loseConnection')
r = http.Request(('GET / HTTP/1.1\r\n'
'Host: www.example.faketld\r\n\r\n'))
r.is_ssl = True
p = yield gen_client_protocol(r, save_all=False)
assert p.transport.value() == ('CONNECT www.example.faketld:443 HTTP/1.1\r\n'
'Host: www.example.faketld\r\n'
'Proxy-Authorization: Basic %s\r\n\r\n') % base64.b64encode('foo:password')
p.transport.clear()
p.dataReceived('HTTP/1.1 407 YOU DUN FUCKED UP\r\n\r\n')
assert not mstarttls.called
assert p.transport.value() == ''
assert closed.called
### ProxyClient tests
@pytest.inlineCallbacks
def test_proxy_client_simple(mocker):
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
req = http.Request('GET / HTTP/1.1\r\n\r\n')
client = gen_client_protocol(req, stream_response=False)
client = yield gen_client_protocol(req, stream_response=False)
assert client.transport.value() == 'GET / HTTP/1.1\r\n\r\n'
client.transport.clear()
rsp = 'HTTP/1.1 200 OKILE DOKELY\r\n\r\n'
@ -602,7 +868,7 @@ def test_proxy_client_simple(mocker):
def test_proxy_client_stream(mocker):
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
req = http.Request('GET / HTTP/1.1\r\n\r\n')
client = gen_client_protocol(req, stream_response=True)
client = yield gen_client_protocol(req, stream_response=True)
client.transport.clear()
client.dataReceived('HTTP/1.1 404 GET FUCKE')
assert client.factory.return_transport.value() == 'HTTP/1.1 404 GET FUCKE'
@ -617,7 +883,7 @@ def test_proxy_client_stream(mocker):
def test_proxy_client_nostream(mocker):
rsave = mocker.patch.object(pappyproxy.http.Request, 'async_deep_save', autospec=True, side_effect=mock_req_async_save)
req = http.Request('GET / HTTP/1.1\r\n\r\n')
client = gen_client_protocol(req, stream_response=False)
client = yield gen_client_protocol(req, stream_response=False)
client.transport.clear()
client.dataReceived('HTTP/1.1 404 GET FUCKE')
assert client.factory.return_transport.value() == ''

@ -15,6 +15,26 @@ class TLSStringTransport(StringTransport):
def startTLS(self, context, factory):
pass
class PappySession(object):
def setup():
"""
Sets up a console session with a connection to a temporary datafile
"""
pass
def cleanup():
"""
Closes connections, deletes temporary datafile
"""
pass
def run_command(command):
"""
Runs the command then returns the non-colorized output
"""
pass
def func_deleted(*args, **kwargs):
raise NotImplementedError()

@ -1,9 +1,13 @@
import StringIO
import datetime
import re
import string
import sys
import time
from .colors import Colors, Styles
from .colors import Styles, Colors, verb_color, scode_color, path_formatter, host_color
from twisted.internet import defer
from twisted.test.proto_helpers import StringTransport
class PappyException(Exception):
"""
@ -12,6 +16,30 @@ class PappyException(Exception):
"""
pass
class PappyStringTransport(StringTransport):
def __init__(self):
StringTransport.__init__(self)
self.complete_deferred = defer.Deferred()
def finish(self):
# Called when a finishable producer finishes
self.producerState = 'stopped'
def registerProducer(self, producer, streaming):
StringTransport.registerProducer(self, producer, streaming)
def waitForProducers(self):
while self.producer and self.producerState == 'producing':
self.producer.resumeProducing()
def loseConnection(self):
StringTransport.loseConnection(self)
self.complete_deferred.callback(None)
def startTLS(self, context, factory):
pass
def printable_data(data):
"""
Return ``data``, but replaces unprintable characters with periods.
@ -55,3 +83,240 @@ def hexdump(src, length=16):
printable = ''.join(["%s" % ((ord(x) <= 127 and FILTER[ord(x)]) or Styles.UNPRINTABLE_DATA+'.'+Colors.ENDC) for x in chars])
lines.append("%04x %-*s %s\n" % (c, length*3, hex, printable))
return ''.join(lines)
# Taken from http://stackoverflow.com/questions/16571150/how-to-capture-stdout-output-from-a-python-function-call
# then modified
class Capturing(object):
def __enter__(self):
self._stdout = sys.stdout
sys.stdout = self._stringio = StringIO.StringIO()
return self
def __exit__(self, *args):
self.val = self._stringio.getvalue()
sys.stdout = self._stdout
@defer.inlineCallbacks
def load_reqlist(line, allow_special=True, ids_only=False):
"""
load_reqlist(line, allow_special=True, ids_only=False)
A helper function for parsing a list of requests passed as an
argument. If ``allow_special`` is True, it will parse special IDs such as
``u123`` or ``s123``. Even if ``allow_special`` is False, it will still
parse ``m##`` IDs. Prints an error for any request that fails to load and
returns a list of the requests that loaded successfully.
:Returns: Twisted deferred
"""
from .http import Request
# Parses a comma separated list of ids and returns a list of those requests
# prints any errors
if not line:
raise PappyException('Request id(s) required')
ids = re.split(',\s*', line)
reqs = []
if not ids_only:
for reqid in ids:
try:
req = yield Request.load_request(reqid, allow_special)
reqs.append(req)
except PappyException as e:
print e
defer.returnValue(reqs)
else:
defer.returnValue(ids)
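The ID-splitting step in `load_reqlist` is easy to show on its own (the actual request loading needs a running datafile). A standalone sketch of just the parsing, with an illustrative function name:

```python
import re

def split_reqids(line):
    # Comma-separated with optional whitespace; special IDs such as u123
    # or s123 pass through unchanged for the loader to interpret
    if not line:
        raise ValueError('Request id(s) required')
    return re.split(r',\s*', line)
```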
def print_table(coldata, rows):
"""
Print a table.
Coldata: list of dicts describing how to print each column.
``name`` is the heading to give the column;
``width`` (optional) is the maximum width before truncating, 0 for unlimited.
Rows: list of tuples with the data to print
"""
# Get the width of each column
widths = []
headers = []
for data in coldata:
if 'name' in data:
headers.append(data['name'])
else:
headers.append('')
empty_headers = True
for h in headers:
if h != '':
empty_headers = False
if not empty_headers:
rows = [headers] + rows
for i in range(len(coldata)):
col = coldata[i]
if 'width' in col and col['width'] > 0:
maxwidth = col['width']
else:
maxwidth = 0
colwidth = 0
for row in rows:
printdata = row[i]
if isinstance(printdata, dict):
collen = len(str(printdata['data']))
else:
collen = len(str(printdata))
if collen > colwidth:
colwidth = collen
if maxwidth > 0 and colwidth > maxwidth:
widths.append(maxwidth)
else:
widths.append(colwidth)
# Print rows
padding = 2
is_heading = not empty_headers
for row in rows:
if is_heading:
sys.stdout.write(Styles.TABLE_HEADER)
for (col, width) in zip(row, widths):
if isinstance(col, dict):
printstr = str(col['data'])
if 'color' in col:
colors = col['color']
formatter = None
elif 'formatter' in col:
colors = None
formatter = col['formatter']
else:
colors = None
formatter = None
else:
printstr = str(col)
colors = None
formatter = None
if len(printstr) > width:
trunc_printstr=printstr[:width]
trunc_printstr=trunc_printstr[:-3]+'...'
else:
trunc_printstr=printstr
if colors is not None:
sys.stdout.write(colors)
sys.stdout.write(trunc_printstr)
sys.stdout.write(Colors.ENDC)
elif formatter is not None:
toprint = formatter(printstr, width)
sys.stdout.write(toprint)
else:
sys.stdout.write(trunc_printstr)
sys.stdout.write(' '*(width-len(printstr)))
sys.stdout.write(' '*padding)
if is_heading:
sys.stdout.write(Colors.ENDC)
is_heading = False
sys.stdout.write('\n')
sys.stdout.flush()
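The width computation at the top of `print_table` separates cleanly from the rendering: each column becomes as wide as its longest cell, clamped by the optional ``width`` key. A standalone sketch of just that step (the function name is illustrative):

```python
def compute_widths(coldata, rows):
    # Each column is as wide as its longest cell, clamped by the optional
    # 'width' key (0 or absent means unlimited)
    widths = []
    for i, col in enumerate(coldata):
        maxwidth = col.get('width', 0)
        colwidth = max(len(str(row[i])) for row in rows) if rows else 0
        if maxwidth > 0 and colwidth > maxwidth:
            widths.append(maxwidth)
        else:
            widths.append(colwidth)
    return widths
```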
def print_requests(requests):
"""
Takes in a list of requests and prints a table with data on each of the
requests. It's the same table that's used by ``ls``.
"""
rows = []
for req in requests:
rows.append(get_req_data_row(req))
print_request_rows(rows)
def print_request_rows(request_rows):
"""
Takes in a list of request rows generated from :func:`pappyproxy.console.get_req_data_row`
and prints a table with data on each of the
requests. Used instead of :func:`pappyproxy.console.print_requests` if you
can't count on storing all the requests in memory at once.
"""
# Print a table with info on all the requests in the list
cols = [
{'name':'ID'},
{'name':'Verb'},
{'name': 'Host'},
{'name':'Path', 'width':40},
{'name':'S-Code', 'width':16},
{'name':'Req Len'},
{'name':'Rsp Len'},
{'name':'Time'},
{'name':'Mngl'},
]
print_rows = []
for row in request_rows:
(reqid, verb, host, path, scode, qlen, slen, time, mngl) = row
verb = {'data':verb, 'color':verb_color(verb)}
scode = {'data':scode, 'color':scode_color(scode)}
host = {'data':host, 'color':host_color(host)}
path = {'data':path, 'formatter':path_formatter}
print_rows.append((reqid, verb, host, path, scode, qlen, slen, time, mngl))
print_table(cols, print_rows)
def get_req_data_row(request):
"""
Get the row data for a request to be printed.
"""
rid = request.reqid
method = request.verb
if 'host' in request.headers:
host = request.headers['host']
else:
host = '??'
path = request.full_path
reqlen = len(request.body)
rsplen = 'N/A'
mangle_str = '--'
if request.unmangled:
mangle_str = 'q'
if request.response:
response_code = str(request.response.response_code) + \
' ' + request.response.response_text
rsplen = len(request.response.body)
if request.response.unmangled:
if mangle_str == '--':
mangle_str = 's'
else:
mangle_str += '/s'
else:
response_code = ''
time_str = '--'
if request.time_start and request.time_end:
time_delt = request.time_end - request.time_start
time_str = "%.2f" % time_delt.total_seconds()
return [rid, method, host, path, response_code,
reqlen, rsplen, time_str, mangle_str]
def confirm(message, default='n'):
"""
A helper function to get confirmation from the user. It prints ``message``
then asks the user to answer yes or no. Returns True if the user answers
yes, otherwise returns False.
"""
if 'n' in default.lower():
default = False
else:
default = True
print message
if default:
answer = raw_input('(Y/n) ')
else:
answer = raw_input('(y/N) ')
if not answer:
return default
if answer[0].lower() == 'y':
return True
else:
return False
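The default-handling in `confirm` is worth seeing without the prompt attached: a default containing "n" means "no", empty input takes the default, and otherwise only an answer starting with y/Y counts as yes. A sketch with the `raw_input` call factored out (function name is illustrative):

```python
def interpret_answer(answer, default='n'):
    # A default containing 'n' means "no"; empty input takes the default;
    # otherwise only an answer starting with y/Y counts as yes
    default_yes = 'n' not in default.lower()
    if not answer:
        return default_yes
    return answer[0].lower() == 'y'
```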

@ -1,9 +1,10 @@
#!/usr/bin/env python
import pkgutil
import pappyproxy
from setuptools import setup, find_packages
VERSION = '0.2.7'
VERSION = pappyproxy.__version__
setup(name='pappyproxy',
version=VERSION,
