Release 0.0.1

See README diff for changes
Branch: master
Author: Rob Glew (9 years ago)
Parent: 6633423420
Commit: 4e6801e4d8
  1. .gitignore (3)
  2. MANIFEST.in (3)
  3. README.md (62)
  4. pappy-proxy/.coveragerc (3)
  5. pappy-proxy/certs/certificate.crt (22)
  6. pappy-proxy/certs/private.key (28)
  7. pappy-proxy/config.py (51)
  8. pappy-proxy/tests/test_proxy.py (56)
  9. pappy-proxy/tests/testutil.py (42)
  10. pappy-proxy/vim_repeater/.#repeater.vim (1)
  11. pappyproxy/.coveragerc (3)
  12. pappyproxy/Makefile (2)
  13. pappyproxy/__init__.py (0)
  14. pappyproxy/__main__.py (0)
  15. pappyproxy/comm.py (10)
  16. pappyproxy/config.py (71)
  17. pappyproxy/console.py (273)
  18. pappyproxy/context.py (16)
  19. pappyproxy/default_user_config.json (0)
  20. pappyproxy/http.py (148)
  21. pappyproxy/mangle.py (29)
  22. pappyproxy/pappy.py (59)
  23. pappyproxy/proxy.py (50)
  24. pappyproxy/repeater.py (3)
  25. pappyproxy/schema/__init__.py (0)
  26. pappyproxy/schema/schema_1.py (0)
  27. pappyproxy/schema/schema_2.py (2)
  28. pappyproxy/schema/update.py (5)
  29. pappyproxy/tests/__init__.py (0)
  30. pappyproxy/tests/test_context.py (4)
  31. pappyproxy/tests/test_http.py (68)
  32. pappyproxy/tests/test_proxy.py (220)
  33. pappyproxy/tests/testutil.py (42)
  34. pappyproxy/util.py (0)
  35. pappyproxy/vim_repeater/__init__.py (0)
  36. pappyproxy/vim_repeater/repeater.py (0)
  37. pappyproxy/vim_repeater/repeater.vim (0)
  38. setup.cfg (2)
  39. setup.py (38)

3
.gitignore vendored

@ -1,5 +1,5 @@
*.pyc
certs/*
pappyproxy/certs/*
debug_out/*
data.db
.coverage
@ -9,3 +9,4 @@ tests/.cache
TAGS
config.json
build/*
*.egg-info/*

@ -0,0 +1,3 @@
include README.md
include LICENSE.txt
recursive-include pappyproxy *.py *.vim

@ -3,7 +3,7 @@ The Pappy Proxy
Introduction
------------
The Pappy (**P**roxy **A**ttack **P**roxy **P**rox**Y**) Proxy is an intercepting proxy for performing web application security testing. Its features are often similar to, or straight-up ripoffs of, [Burp Suite](https://portswigger.net/burp/). However, Burp Suite is neither open source nor a command line tool, thus making a proxy like Pappy inevitable. The project is still in its early stages, so there are bugs and not a ton of features, but it should be ready for the bigtime soon (I'm already trying to use it as a replacement for Burp Suite).
The Pappy (**P**roxy **A**ttack **P**roxy **P**rox**Y**) Proxy is an intercepting proxy for performing web application security testing. Its features are often similar to, or straight-up ripoffs of, [Burp Suite](https://portswigger.net/burp/). However, Burp Suite is neither open source nor a command line tool, thus making a proxy like Pappy inevitable. The project is still in its early stages, so there are bugs and only the bare minimum features, but it should be able to do some cool stuff soon (I'm already using it for real<sup>tm</sup> work).
Contributing
------------
@ -16,11 +16,11 @@ How to Use It
Installation
------------
Installation requires `pip` or some other command that can handle a `setup.py` with requirements. Once the requirements are installed, you can run the `pappy.py` script to run the proxy. You're on your own to link it somewhere in your PATH.
Pappy supports OS X and Linux (sorry Windows). Installation requires `pip` or some other command that can handle a `setup.py` with requirements. Once the requirements are installed, you can check that it installed correctly by running `pappy -l` to start the proxy.
```
$ git clone https://github.com/roglew/pappy-proxy.git
$ cd pappy-proxy
$ pip install -e .
$ pip install .
```
Quickstart
@ -30,9 +30,8 @@ Pappy projects take up an entire directory. While a full directory may seem like
```
$ mkdir test_project
$ cd test_project
$ /path/to/pappy.py
$ pappy
Copying default config to directory
Updating schema to version 1
Proxy is listening on port 8000
itsPappyTime> exit
$ ls
@ -42,6 +41,20 @@ $
And that's it! The proxy will by default be running on port 8000 and bound to localhost (to keep the hackers out). You can modify the port/interface in `config.json`. You can list all your intercepted requests with `ls`, view a full request with `vfq <reqid>`, or view a full response with `vfs <reqid>`. No, you can't delete them yet. I'm working on it.
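Just to give a feel for it, a session might look something like this (the IDs and output below are completely made up, but the columns are the ones `ls` prints in this release):

```
itsPappyTime> ls
ID  Verb  Host             Path    S-Code  Req Len  Rsp Len  Time      Mngl
2   GET   www.example.com  /login  200 OK  0        1532     0.32 sec  --
1   GET   www.example.com  /       200 OK  0        5104     0.25 sec  --
itsPappyTime> vfq 2
GET /login HTTP/1.1
Host: www.example.com
...
```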
Lite Mode
---------
If you don't want to dirty up a directory, you can run Pappy in "lite" mode. Pappy will use the default configuration settings and will create a temporary datafile in `/tmp` to use. When you quit, the file will be deleted. If you want to run Pappy in lite mode, run Pappy with either `-l` or `--lite`.
Example:
```
$ pappy -l
Temporary datafile is /tmp/tmpw4mGv2
Proxy is listening on port 8000
itsPappyTime> quit
Deleting temporary datafile
$
```
Adding The CA Cert to Your Browser
----------------------------------
In order for Pappy to view data sent using HTTPS, you need to add a generated CA cert (`certificate.crt`) to your browser. Certificates are generated using the `gencerts` command and are by default stored in the same directory as `pappy.py`. This allows Pappy to act as a CA and MITM HTTPS connections. I believe that Firefox and Chrome ignore keychain/system certs, so you will have to install the CA cert to the browsers instead of (or in addition to) adding the cert to your keychain.
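For example, generating the certs from inside Pappy looks roughly like this (the path will be wherever your cert directory actually is):

```
itsPappyTime> gencerts
This will overwrite any existing certs in /path/to/pappy/certs. Are you sure?
(y/N) y
Generating certs to /path/to/pappy/certs
itsPappyTime>
```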
@ -56,7 +69,7 @@ You can add the CA cert to Chrome by going to `Settings -> Show advanced setting
For Safari (on macs, obviously), you need to add the CA cert to your system keychain. You can do this by double clicking on the CA cert and following the prompts.
### Internet Explorer
I didn't search too hard for instructions on this and I don't own a Windows machine to try this, so if you have trouble, hit me up and I'll see if I can help and add real instructions. According to Google you can double-click the cert to install it to the system, or you can do `Tools -> Content -> Certificates -> Trusted Root Certificates -> Import`
I didn't search too hard for instructions on this (since Pappy doesn't support Windows) and I don't own a Windows machine to try this, so if you have trouble, I'm not the one to ask. According to Google you can double-click the cert to install it to the system, or you can do `Tools -> Content -> Certificates -> Trusted Root Certificates -> Import`.
Configuration
-------------
@ -64,16 +77,16 @@ Configuration for each project is done in the `config.json` file. The file is a
| Key | Value |
|:--|:--|
| data_file | The file where requests and images will be stored |
| debug_dir (optional) | Where connection debug info should be stored. If not present, debug info is not saved to a file. |
| cert_dir | Where the CA cert and the private key for the CA cert are stored |
| proxy_listeners | A list of dicts which describe which ports the proxy will listen on. Each item is a dict with "port" and "interface" values which determine which port and interface to listen on. For example, if port=8000 and the interface is 127.0.0.1, the proxy will only accept connections from localhost on port 8000. To accept connections from anywhere, set the interface to 0.0.0.0. |
| `data_file` | The file where requests and images will be stored |
| `debug_dir` (optional) | Where connection debug info should be stored. If not present, debug info is not saved to a file. |
| `cert_dir` | Where the CA cert and the private key for the CA cert are stored |
| `proxy_listeners` | A list of dicts which describe which ports the proxy will listen on. Each item is a dict with "port" and "interface" values which determine which port and interface to listen on. For example, if port=8000 and the interface is 127.0.0.1, the proxy will only accept connections from localhost on port 8000. To accept connections from anywhere, set the interface to 0.0.0.0. |
The following tokens will also be replaced with values:
| Token | Replaced with |
|:--|:--|
| {PAPPYDIR} | The directory where Pappy's files are stored |
| `{PAPPYDIR}` | The directory where Pappy's files are stored |
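To make that concrete, here's an illustrative `config.json` using these keys and the `{PAPPYDIR}` token (the values are just example defaults, not a recommendation):

```
{
    "data_file": "./data.db",
    "cert_dir": "{PAPPYDIR}/certs",
    "proxy_listeners": [
        {"port": 8000, "interface": "127.0.0.1"}
    ]
}
```

With a config like this, the data file lives in the project directory, the CA cert comes out of Pappy's install directory, and the proxy only listens on localhost port 8000.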
Generating Pappy's CA Cert
--------------------------
@ -89,7 +102,8 @@ The following commands can be used to view requests and responses
| Command | Aliases | Description |
|:--------|:--------|:------------|
| `ls [a|<num>]`| list, ls |List requests that are in the current context (see Context section). Has information like the host, target path, and status code. With no arguments, it will print the 50 most recent requests in the current context. If you pass 'a' or 'all' as an argument, it will print all the requests in the current context. If you pass a number "n" as an argument, it will print the n most recent requests in the current context. |
| `ls [a|<num>]` | list, ls | List requests that are in the current context (see Context section). Has information like the host, target path, and status code. With no arguments, it will print the 25 most recent requests in the current context. If you pass 'a' or 'all' as an argument, it will print all the requests in the current context. If you pass a number "n" as an argument, it will print the n most recent requests in the current context. |
| `viq <id> [u]` | view_request_info, viq | View additional information about a request. Includes the target port, if SSL was used, and other information. If 'u' is given as an additional argument, it will print information on the unmangled version of the request. |
| `vfq <id> [u]` | view_full_request, vfq | [V]iew [F]ull Re[Q]uest, prints the full request including headers and data. If 'u' is given as an additional argument, it will print the unmangled version of the request. |
| `vhq <id> [u]` | view_request_headers, vhq | [V]iew [H]eaders of a Re[Q]uest. Prints just the headers of a request. If 'u' is given as an additional argument, it will print the unmangled version of the request. |
| `vfs <id> [u]` | view_full_response, vfs |[V]iew [F]ull Re[S]ponse, prints the full response associated with a request including headers and data. If 'u' is given as an additional argument, it will print the unmangled version of the response. |
@ -115,7 +129,7 @@ The context is a set of filters that define which requests are considered "activ
| Command | Aliases | Description |
|:--------|:------------|:---|
| `fl <filter string>` | filter, fl |Add a filter that limits which requests are included in the current context. See the Filter String section for how to create a filter string |
| `f <filter string>` | filter, fl, f |Add a filter that limits which requests are included in the current context. See the Filter String section for how to create a filter string |
| `fc` | filter_clear, fc | Clears the filters and resets the context to contain all requests and responses. Ignores scope |
| `fls` | filter_list, fls | Print the filters that make up the current context |
@ -184,12 +198,12 @@ Matches both A and B but not C
| contains | contains, ct | A contains B is true if B is a substring of A |
| containsr | containsr, ctr | A containsr B is true if A matches regexp B (NOT IMPLEMENTED) |
| exists | exists, ex | A exists B if A is not an empty string (likely buggy) |
| Leq | Leq, L= | A Leq B if A's length equals B (B must be a number) |
| Lgt | Lgt, L> | A Lgt B if A's length is greater than B (B must be a number) |
| Llt | Llt, L< | A Llt B if A's length is less than B (B must be a number) |
| eq | eq, = | A eq B if A = B (A and B must be a number) |
| gt | gt, > | A gt B if A > B (A and B must be a number) |
| lt | lt, < | A lt B if A < B (A and B must be a number) |
| Leq | Leq | A Leq B if A's length equals B (B must be a number) |
| Lgt | Lgt | A Lgt B if A's length is greater than B (B must be a number) |
| Llt | Llt | A Llt B if A's length is less than B (B must be a number) |
| eq | eq | A eq B if A = B (A and B must be a number) |
| gt | gt | A gt B if A > B (A and B must be a number) |
| lt | lt | A lt B if A < B (A and B must be a number) |
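As a quick example of putting fields and relations together (assuming `host` and `path` are valid filter fields; see the Filter String section for what you can actually filter on):

```
itsPappyTime> f host ct example.com
itsPappyTime> f path ct /api/
itsPappyTime> fls
host ct example.com
path ct /api/
itsPappyTime> fc
```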
Scope
-----
@ -207,13 +221,15 @@ Any requests which don't match all the filters in the scope will be passed strai
Interceptor
-----------
This feature is like Burp's proxy with "Intercept Mode" turned on, except it's not turned on unless you explicitly turn it on. When the proxy gets a request while in intercept mode, it lets you edit it with vim before it forwards it to the server. In addition, it can stop responses from the server and let you edit them with vim before they get forwarded to the browser. When you run the command, you can pass `request` and/or `response` as arguments to say whether you would like to intercept requests and/or responses. Only in-scope requests/responses will be intercepted (see Scope section)
This feature is like Burp's proxy with "Intercept Mode" turned on, except it's not turned on unless you explicitly turn it on. When the proxy gets a request while in intercept mode, it lets you edit it before it forwards it to the server. In addition, it can stop responses from the server and let you edit them before they get forwarded to the browser. When you run the command, you can pass `request` and/or `response` as arguments to say whether you would like to intercept requests and/or responses. Only in-scope requests/responses will be intercepted (see Scope section).
The interceptor will use your EDITOR variable to decide which editor to edit the request/response with. If no editor variable is set, it will default to `vi`.
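For example, to edit intercepted requests/responses with nano instead:

```
$ export EDITOR=nano
$ pappy
```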
To forward a request, edit it, save the file, then quit.
| Command | Aliases | Description |
|:--------|:--------|:------------|
| `ic <requests, responses, request, response, req, rsp>+` | intercept, ic | Begins interception mode. Press enter to leave interception mode and return to the command prompt. Pass in `request` to intercept requests, `response` to intercept responses, or both to intercept both. |
| `ic <requests,responses,request,response,req,rsp>+` | intercept, ic | Begins interception mode. Press enter to leave interception mode and return to the command prompt. Pass in `request` to intercept requests, `response` to intercept responses, or both to intercept both. |
```
Intercept both requests and responses:
@ -236,6 +252,8 @@ Repeater
--------
This feature is like Burp's repeater (yes, really). You choose a request and Pappy will open vim in a split window with your request on the left and the original response on the right. You can make changes to the request and then run ":RepeaterSubmitBuffer" to submit the modified request. The response will be displayed on the right. This command is bound to `<leader>f` by default, but you can rebind it in your vimrc (I think, dunno if vim will complain if it's undefined). This command will submit whatever buffer your cursor is in, so make sure it's in the request buffer.
To drop a request, delete everything, save and quit (`ggdG:wq`).
When you're done with repeater, run ":qa!" to avoid having to save changes to nonexistent files.
| Command | Aliases | Description |
@ -244,7 +262,7 @@ When you're done with repeater, run ":qa!" to avoid having to save changes to no
| Vim Command | Keybinding | Action |
|:--------|:-----------|:-------|
| RepeaterSubmitBuffer | `<leader>f` | Submit the current buffer, split the windows vertically, and show the result in the right window |
| `RepeaterSubmitBuffer` | `<leader>f` | Submit the current buffer, split the windows vertically, and show the result in the right window |
Logging
-------

@ -1,3 +0,0 @@
[run]
omit = tests/*, schema/*

@ -1,22 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIDjzCCAncCFQDmrLdMg37vTWXeF9Zp0WjQmQWF1jANBgkqhkiG9w0BAQsFADBg
MQswCQYDVQQGEwJVUzERMA8GA1UECAwITWljaGlnYW4xEjAQBgNVBAcMCUFubiBB
cmJvcjEUMBIGA1UECgwLUGFwcHkgUHJveHkxFDASBgNVBAMMC1BhcHB5IFByb3h5
MB4XDTE1MTAyNjE2MDYxMVoXDTI1MTAyMzE2MDYxMVowYDELMAkGA1UEBhMCVVMx
ETAPBgNVBAgMCE1pY2hpZ2FuMRIwEAYDVQQHDAlBbm4gQXJib3IxFDASBgNVBAoM
C1BhcHB5IFByb3h5MRQwEgYDVQQDDAtQYXBweSBQcm94eTCCASIwDQYJKoZIhvcN
AQEBBQADggEPADCCAQoCggEBAPNQo64jLgvKVKNqqLi0cDBfWqp+ZhEDaGdm3Rjl
AFerqmDHyAeCu1GENQAwcmmeXCwMYSbjcMHSrExR+rcQRxvJ8OOp2doP43+T9hd8
rZt+PPOiBVG0cUrfdsVdbUyGjPmZFtWaiSVG2gUOdO2m7jK5WwIEcW5u6vEfmgco
/JLvtdgGZGIlsZGeQGcJdeZ6LaPKLHxPAkgRQduQTpK5nKiFi0Aqj4AsqddcZ4fo
X3zGsypkt0NVTn4nMZLR9Ml5mwzTltr9BBtSVqMIMwqVkKLkGFdaIFsY5dK3UYUV
vqLGB6ubheULLjmkv9FJLmaHfnLb2jjA17K+y3QKosMVldcCAwEAAaNFMEMwEgYD
VR0TAQH/BAgwBgEB/wIBADAOBgNVHQ8BAf8EBAMCAQYwHQYDVR0OBBYEFNo5o+5e
a0sNMlW/75VgGJCv2AcJMA0GCSqGSIb3DQEBCwUAA4IBAQBMbpA8XkEvtpErHsy/
FCtzQGmn88idU43fFSi0bcsWWc1ekapd7iTramItvZ8OCZD3/oVE4VIwumuJuoVk
OU/Tip0e+haPV5f1JImdsk2f20WJ0lJ5CyrrRcddqgVrcQbB8DwaJSJRXzrSD9Cp
UDfJhIh2zxRolGql29X6QiFukV3CIHn2hF+QYlMrxkoI0e4r6sDtmN4/VccgADdH
pQeVz4z/ZxKBIh7Xol8K6Qr+gXnlkbp3n5WXGHbv4YsK995z9yVZpuLPUHbpnSzr
KVJ5I4joA22uc2tqeKvfp4QsE8fa/nVNRv/LZZeCdg0zrXXpE9RoxNirwEcQwAo1
x25g
-----END CERTIFICATE-----

@ -1,28 +0,0 @@
-----BEGIN PRIVATE KEY-----
MIIEvwIBADANBgkqhkiG9w0BAQEFAASCBKkwggSlAgEAAoIBAQDzUKOuIy4LylSj
aqi4tHAwX1qqfmYRA2hnZt0Y5QBXq6pgx8gHgrtRhDUAMHJpnlwsDGEm43DB0qxM
Ufq3EEcbyfDjqdnaD+N/k/YXfK2bfjzzogVRtHFK33bFXW1Mhoz5mRbVmoklRtoF
DnTtpu4yuVsCBHFuburxH5oHKPyS77XYBmRiJbGRnkBnCXXmei2jyix8TwJIEUHb
kE6SuZyohYtAKo+ALKnXXGeH6F98xrMqZLdDVU5+JzGS0fTJeZsM05ba/QQbUlaj
CDMKlZCi5BhXWiBbGOXSt1GFFb6ixgerm4XlCy45pL/RSS5mh35y29o4wNeyvst0
CqLDFZXXAgMBAAECggEBAJxlD+ClkjpX4lFsBGk86gPdtrxyJI74/snAD4up3q97
kzdEEuno+Rhrf1nQyinjdWGGz4ecl+St0rv30cyLdPmCswjTK0mD/voJFByCsmCJ
IwqC8SJUdqHmw0QXSmLu9XyWD1xbSZ4hTZAEe9op+1+1Tq8cRgDy4Kb+ZhYGHVsf
4o1RFGBCtSGLFBC908xZnQlqzGHtCuiBecJiWqoFK+mm3TgEUp4VDPRSPsWDWYnJ
KxciTSE9roBF7VAe5ocTRdn+tj9GVaNaBLqb1XhkU41wZxVMoid0OVgxkmyEdAyR
lL1/zVyQDgJbke4t6dgu4NCAoPWXKZP1zxNa1Ied51kCgYEA+h2X7MO8rYyWHGT7
EZoPpHSrR3F1MnsRgXnkVt5dSrwAQlLmQmmWnjVtEQM72Eox1Czdz+GjILpvfwNF
fktzDa1GghO5TdDibcchG01qLeqEj0vgvtCP1YFLeCBZJv4yPxpaHWhyUOYPWoXq
Mze7yYbkh2uYORPKgu+N4b4oH90CgYEA+QoWQ+44j2jld4DLvYpW/tf2kvKkmFl5
43KSVXkDHSnEfO+RFpFQ8rCOKetlMbcuQMakTz++fh3smHWGZ/S1Hm1ZUIRQqCzq
m1dTg8PX6pH9e7/0gebFqQWtGhWQdnSWmGZAEnAnmFq6DrDB0FHvfS+VePC1knEJ
/Aw4l+YFy0MCgYA60YLM1ysj1Q/oFYdFmGldT2KIJpJdELwJKtUb6Kcf0B5vendT
3ujgw8emXJBSSQB22SZAoNtv8ugNgoNxM+UWrk0KggDt39Wf41hRx17U9XW/DSUJ
OprYptNMqK7OkLDYTiYrDEj15WRu8VcmPFEZD3PmtNLTeWgCart+/u0IsQKBgQCG
xSirdl1xbmjPtQmM9zKBE0pC18CvGazWo4gBbU18GMBWhCbWOam+zEEC+np23xTO
xTDiGjLyeSsyjldAJrNlVfPBmPk1KamEi0uMwQ01ye+NaqHdMo/BGmtE9GqLUCi3
LI576+nhjyelD46zN8QM0RVor4rzRu0KU2rE+RwllQKBgQDZ1j5Uhblxn+WJ1/z3
xZfP23VJLVCCvBIXaHENCl01/9hSBFqH0K+EUUfeJesWoh7KSdaiHXGRR1XdB1rs
Bmzh4wPgIlcc8CPmJxZ09fM2ggHSZf1baV8lEf64/N3OnENDvUAepzwIe0IhKs1i
pzpCgCGttWxEZJvcug4AOulfQA==
-----END PRIVATE KEY-----

@ -1,51 +0,0 @@
import imp
import json
import os
import shutil
# Make sure we have a config file
if not os.path.isfile('./config.json'):
print "Copying default config to directory"
default_config_file = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'default_user_config.json')
shutil.copyfile(default_config_file, './config.json')
# Load local project config
with open('./config.json', 'r') as f:
proj_config = json.load(f)
# Substitution dictionary
subs = {}
subs['PAPPYDIR'] = os.path.dirname(os.path.realpath(__file__))
# Data file settings
if 'data_file' in proj_config:
DATAFILE = proj_config["data_file"].format(**subs)
else:
DATAFILE = 'data.db'
# Debug settings
if 'debug_dir' in proj_config:
DEBUG_TO_FILE = True
DEBUG_DIR = proj_config["debug_dir"].format(**subs)
else:
DEBUG_DIR = None
DEBUG_TO_FILE = False
DEBUG_VERBOSITY = 0
# Cert directory settings
if 'cert_dir' in proj_config:
CERT_DIR = proj_config["cert_dir"].format(**subs)
else:
CERT_DIR = './certs'
SSL_PKEY_FILE = 'private.key'
SSL_CA_FILE = 'certificate.crt'
# Listener settings
if "proxy_listeners" in proj_config:
LISTENERS = []
for l in proj_config["proxy_listeners"]:
LISTENERS.append((l['port'], l['interface']))
else:
LISTENERS = [(8000, '127.0.0.1')]

@ -1,56 +0,0 @@
import pytest
import mangle
import twisted.internet
import twisted.test
from proxy import ProxyClient, ProxyClientFactory, ProxyServer
from testutil import mock_deferred, func_deleted, no_tcp, ignore_tcp, no_database, func_ignored
from twisted.internet.protocol import ServerFactory
from twisted.test.iosim import FakeTransport
from twisted.internet import defer, reactor
####################
## Fixtures
@pytest.fixture
def proxyserver(monkeypatch):
monkeypatch.setattr("twisted.test.iosim.FakeTransport.startTLS", func_ignored)
factory = ServerFactory()
factory.protocol = ProxyServer
protocol = factory.buildProtocol(('127.0.0.1', 0))
protocol.makeConnection(FakeTransport(protocol, True))
return protocol
## Autorun fixtures
@pytest.fixture(autouse=True)
def no_mangle(monkeypatch):
# Don't call anything in mangle.py
monkeypatch.setattr("mangle.mangle_request", func_deleted)
monkeypatch.setattr("mangle.mangle_response", func_deleted)
####################
## Unit test tests
def test_proxy_server_fixture(proxyserver):
proxyserver.transport.write('hello')
assert proxyserver.transport.getOutBuffer() == 'hello'
@pytest.inlineCallbacks
def test_mock_deferreds(mock_deferred):
d = mock_deferred('Hello!')
r = yield d
assert r == 'Hello!'
def test_deleted():
with pytest.raises(NotImplementedError):
reactor.connectTCP("www.google.com", "80", ServerFactory)
####################
## Proxy Server Tests
def test_proxy_server_connect(proxyserver):
proxyserver.lineReceived('CONNECT www.dddddd.fff:433 HTTP/1.1')
proxyserver.lineReceived('')
assert proxyserver.transport.getOutBuffer() == 'HTTP/1.1 200 Connection established\r\n\r\n'
#assert starttls got called

@ -1,42 +0,0 @@
import pytest
from twisted.internet import defer
class ClassDeleted():
pass
def func_deleted(*args, **kwargs):
raise NotImplementedError()
def func_ignored(*args, **kwargs):
pass
@pytest.fixture
def mock_deferred():
# Generates a function that can be used to make a deferred that can be used
# to mock out deferred-returning responses
def f(value):
def g(data):
return value
d = defer.Deferred()
d.addCallback(g)
d.callback(None)
return d
return f
@pytest.fixture(autouse=True)
def no_tcp(monkeypatch):
# Don't make tcp connections
monkeypatch.setattr("twisted.internet.reactor.connectTCP", func_deleted)
monkeypatch.setattr("twisted.internet.reactor.connectSSL", func_deleted)
@pytest.fixture
def ignore_tcp(monkeypatch):
# Don't make tcp connections
monkeypatch.setattr("twisted.internet.reactor.connectTCP", func_ignored)
monkeypatch.setattr("twisted.internet.reactor.connectSSL", func_ignored)
@pytest.fixture(autouse=True)
def no_database(monkeypatch):
# Don't make database queries
monkeypatch.setattr("twisted.enterprise.adbapi.ConnectionPool",
ClassDeleted)

@ -1 +0,0 @@
glew@localhost.787:1446907770

@ -0,0 +1,3 @@
[run]
omit = tests/*, schema/*, console.py, vim_repeater/*

@ -1,6 +1,6 @@
install-third-party:
pip install -r requirements.txt
pip install -e ..
test:
py.test -rw --twisted --cov-config .coveragerc --cov=. tests/

@ -1,6 +1,6 @@
import base64
import http
import json
import pappyproxy
from twisted.protocols.basic import LineReceiver
from twisted.internet import defer
@ -74,7 +74,7 @@ class CommServer(LineReceiver):
except KeyError:
raise PappyException("Request with given ID does not exist")
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
dat = json.loads(req.to_json())
defer.returnValue(dat)
@ -85,9 +85,9 @@ class CommServer(LineReceiver):
except KeyError:
raise PappyException("Request with given ID does not exist, cannot fetch associated response.")
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
if req.response:
rsp = yield http.Response.load_response(req.response.rspid)
rsp = yield pappyproxy.http.Response.load_response(req.response.rspid)
dat = json.loads(rsp.to_json())
else:
dat = {}
@ -96,7 +96,7 @@ class CommServer(LineReceiver):
@defer.inlineCallbacks
def action_submit_request(self, data):
try:
req = http.Request(base64.b64decode(data['full_request']))
req = pappyproxy.http.Request(base64.b64decode(data['full_request']))
req.port = data['port']
req.is_ssl = data['is_ssl']
except:

@ -0,0 +1,71 @@
import imp
import json
import os
import shutil
PAPPY_DIR = os.path.dirname(os.path.realpath(__file__))
CERT_DIR = PAPPY_DIR
DATAFILE = 'data.db'
DEBUG_DIR = None
DEBUG_TO_FILE = False
DEBUG_VERBOSITY = 0
LISTENERS = [(8000, '127.0.0.1')]
SSL_CA_FILE = 'certificate.crt'
SSL_PKEY_FILE = 'private.key'
def get_default_config():
default_config_file = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'default_user_config.json')
with open(default_config_file) as f:
settings = json.load(f)
return settings
def load_settings(proj_config):
global CERT_DIR
global DATAFILE
global DEBUG_DIR
global DEBUG_TO_FILE
global DEBUG_VERBOSITY
global LISTENERS
global PAPPY_DIR
global SSL_CA_FILE
global SSL_PKEY_FILE
# Substitution dictionary
subs = {}
subs['PAPPYDIR'] = PAPPY_DIR
# Data file settings
if 'data_file' in proj_config:
DATAFILE = proj_config["data_file"].format(**subs)
# Debug settings
if 'debug_dir' in proj_config:
if proj_config['debug_dir']:
DEBUG_TO_FILE = True
DEBUG_DIR = proj_config["debug_dir"].format(**subs)
# Cert directory settings
if 'cert_dir' in proj_config:
CERT_DIR = proj_config["cert_dir"].format(**subs)
# Listener settings
if "proxy_listeners" in proj_config:
LISTENERS = []
for l in proj_config["proxy_listeners"]:
LISTENERS.append((l['port'], l['interface']))
def load_from_file(fname):
# Make sure we have a config file
if not os.path.isfile(fname):
print "Copying default config to %s" % fname
default_config_file = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'default_user_config.json')
shutil.copyfile(default_config_file, fname)
# Load local project config
with open(fname, 'r') as f:
proj_config = json.load(f)
load_settings(proj_config)

@ -1,11 +1,10 @@
import cmd2
import config
import context
import crochet
import mangle
import proxy
import repeater
import select
import curses
import datetime
import os
import pappyproxy
import pygments
import shlex
import string
import subprocess
@ -13,9 +12,10 @@ import sys
import termios
import time
import http
from twisted.internet import defer, reactor
from util import PappyException
from pappyproxy.util import PappyException
from pygments.lexers import get_lexer_for_mimetype
from pygments.formatters import TerminalFormatter
from pygments.util import ClassNotFound
"""
console.py
@ -57,6 +57,37 @@ class ProxyCmd(cmd2.Cmd):
self.alerts = []
return stop
def help_view_request_info(self):
print ("View information about a request\n"
"Usage: view_request_info <reqid> [u]\n"
"If 'u' is given as an additional argument, the unmangled version "
"of the request will be displayed.")
@print_pappy_errors
@crochet.wait_for(timeout=30.0)
@defer.inlineCallbacks
def do_view_request_info(self, line):
args = shlex.split(line)
try:
reqid = int(args[0])
showid = reqid
except:
raise PappyException("Enter a valid number for the request id")
req = yield pappyproxy.http.Request.load_request(reqid)
showreq = req
show_unmangled = False
if len(args) > 1 and args[1][0].lower() == 'u':
if not req.unmangled:
raise PappyException("Request was not mangled")
show_unmangled = True
showreq = req.unmangled
print ''
print_request_extended(showreq)
print ''
def help_view_request_headers(self):
print ("View the headers of the request\n"
"Usage: view_request_headers <reqid> [u]"
@ -74,7 +105,7 @@ class ProxyCmd(cmd2.Cmd):
except:
raise PappyException("Enter a valid number for the request id")
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
showreq = req
show_unmangled = False
@ -84,10 +115,7 @@ class ProxyCmd(cmd2.Cmd):
show_unmangled = True
showreq = req.unmangled
print ''
print_requests([showreq])
if show_unmangled:
print ''
print 'UNMANGLED --------------------'
print ''
view_full_request(showreq, True)
@ -109,7 +137,7 @@ class ProxyCmd(cmd2.Cmd):
except:
raise PappyException("Enter a valid number for the request id")
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
showreq = req
show_unmangled = False
@ -119,10 +147,7 @@ class ProxyCmd(cmd2.Cmd):
show_unmangled = True
showreq = req.unmangled
print ''
print_requests([showreq])
if show_unmangled:
print ''
print 'UNMANGLED --------------------'
print ''
view_full_request(showreq)
@ -142,7 +167,7 @@ class ProxyCmd(cmd2.Cmd):
except:
raise PappyException("Enter a valid number for the request id")
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
showrsp = req.response
show_unmangled = False
@ -152,8 +177,6 @@ class ProxyCmd(cmd2.Cmd):
show_unmangled = True
showrsp = req.response.unmangled
print ''
print_requests([req])
if show_unmangled:
print ''
print 'UNMANGLED --------------------'
@ -175,7 +198,7 @@ class ProxyCmd(cmd2.Cmd):
except:
raise PappyException("Enter a valid number for the request id")
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
showrsp = req.response
show_unmangled = False
@ -185,8 +208,6 @@ class ProxyCmd(cmd2.Cmd):
show_unmangled = True
showrsp = req.response.unmangled
print ''
print_requests([req])
if show_unmangled:
print ''
print 'UNMANGLED --------------------'
@ -212,14 +233,14 @@ class ProxyCmd(cmd2.Cmd):
else:
print_count = 25
context.sort()
pappyproxy.context.sort()
if print_count > 0:
to_print = context.active_requests[:]
to_print = pappyproxy.context.active_requests[:]
to_print = sorted(to_print, key=lambda x: x.reqid, reverse=True)
to_print = to_print[:print_count]
print_requests(to_print)
else:
print_requests(context.active_requests)
print_requests(pappyproxy.context.active_requests)
def help_filter(self):
print ("Apply a filter to the current context\n"
@ -231,8 +252,8 @@ class ProxyCmd(cmd2.Cmd):
if not line:
raise PappyException("Filter string required")
filter_to_add = context.Filter(line)
context.add_filter(filter_to_add)
filter_to_add = pappyproxy.context.Filter(line)
pappyproxy.context.add_filter(filter_to_add)
def help_filter_clear(self):
print ("Reset the context so that it contains no filters (ignores scope)\n"
@ -242,8 +263,8 @@ class ProxyCmd(cmd2.Cmd):
@crochet.wait_for(timeout=30.0)
@defer.inlineCallbacks
def do_filter_clear(self, line):
context.active_filters = []
yield context.reload_from_storage()
pappyproxy.context.active_filters = []
yield pappyproxy.context.reload_from_storage()
def help_filter_list(self):
print ("Print the filters that make up the current context\n"
@ -251,7 +272,7 @@ class ProxyCmd(cmd2.Cmd):
@print_pappy_errors
def do_filter_list(self, line):
for f in context.active_filters:
for f in pappyproxy.context.active_filters:
print f.filter_string
@ -263,8 +284,8 @@ class ProxyCmd(cmd2.Cmd):
@crochet.wait_for(timeout=30.0)
@defer.inlineCallbacks
def do_scope_save(self, line):
context.save_scope()
yield context.store_scope(http.dbpool)
pappyproxy.context.save_scope()
yield pappyproxy.context.store_scope(pappyproxy.http.dbpool)
def help_scope_reset(self):
print ("Set the context to be the scope (view in-scope items)\n"
@ -274,7 +295,7 @@ class ProxyCmd(cmd2.Cmd):
@crochet.wait_for(timeout=30.0)
@defer.inlineCallbacks
def do_scope_reset(self, line):
yield context.reset_to_scope()
yield pappyproxy.context.reset_to_scope()
def help_scope_delete(self):
print ("Delete the scope so that it contains all request/response pairs\n"
@ -284,8 +305,8 @@ class ProxyCmd(cmd2.Cmd):
@crochet.wait_for(timeout=30.0)
@defer.inlineCallbacks
def do_scope_delete(self, line):
context.set_scope([])
yield context.store_scope(http.dbpool)
pappyproxy.context.set_scope([])
yield pappyproxy.context.store_scope(pappyproxy.http.dbpool)
def help_scope_list(self):
print ("Print the filters that make up the scope\n"
@ -293,7 +314,7 @@ class ProxyCmd(cmd2.Cmd):
@print_pappy_errors
def do_scope_list(self, line):
context.print_scope()
pappyproxy.context.print_scope()
def help_repeater(self):
print ("Open a request in the repeater\n"
@ -312,7 +333,7 @@ class ProxyCmd(cmd2.Cmd):
umid = get_unmangled(reqid)
if umid is not None:
repid = umid
repeater.start_editor(repid)
pappyproxy.repeater.start_editor(repid)
def help_submit(self):
print "Submit a request again (NOT IMPLEMENTED)"
@ -328,7 +349,7 @@ class ProxyCmd(cmd2.Cmd):
# print printable_data(rsp.full_response)
def help_intercept(self):
print ("Intercept requests and/or responses and edit them with vim before passing them along\n"
print ("Intercept requests and/or responses and edit them with before passing them along\n"
"Usage: intercept <reqid>")
@print_pappy_errors
@ -346,49 +367,80 @@ class ProxyCmd(cmd2.Cmd):
if any(a in rsp_names for a in args):
intercept_responses = True
if intercept_requests:
print "Intercepting reqeusts"
if intercept_responses:
print "Intercepting responses"
if intercept_requests and intercept_responses:
intercept_str = 'Requests and responses'
elif intercept_requests:
intercept_str = 'Requests'
elif intercept_responses:
intercept_str = 'Responses'
else:
intercept_str = 'NOTHING'
mangle.set_intercept_requests(intercept_requests)
mangle.set_intercept_responses(intercept_responses)
while 1:
if select.select([sys.stdin,],[],[],0.0)[0]:
break;
pappyproxy.mangle.set_intercept_requests(intercept_requests)
pappyproxy.mangle.set_intercept_responses(intercept_responses)
## Interceptor loop
stdscr = curses.initscr()
curses.noecho()
curses.cbreak()
try:
editnext = False
stdscr.nodelay(True)
while True:
stdscr.addstr(0, 0, "Currently intercepting: %s" % intercept_str)
stdscr.clrtoeol()
stdscr.addstr(1, 0, "%d item(s) in queue." % len(edit_queue))
stdscr.clrtoeol()
if editnext:
stdscr.addstr(2, 0, "Waiting for next item... Press 'q' to quit or 'b' to quit waiting")
else:
if len(edit_queue) > 0:
stdscr.addstr(2, 0, "Press 'n' to edit the next item or 'q' to quit interceptor.")
stdscr.clrtoeol()
c = stdscr.getch()
if c == ord('q'):
break
elif c == ord('n'):
editnext = True
elif c == ord('b'):
editnext = False
if editnext and edit_queue:
editnext = False
(to_edit, deferred) = edit_queue.pop(0)
# Edit the file
subprocess.call(['vim', to_edit])
# Fire the callback
editor = 'vi'
if 'EDITOR' in os.environ:
editor = os.environ['EDITOR']
subprocess.call([editor, to_edit])
stdscr.clear()
deferred.callback(None)
time.sleep(0.2)
finally:
curses.nocbreak()
stdscr.keypad(0)
curses.echo()
curses.endwin()
pappyproxy.mangle.set_intercept_requests(False)
pappyproxy.mangle.set_intercept_responses(False)
# Send remaining requests along
while len(edit_queue) > 0:
(fname, deferred) = edit_queue.pop(0)
deferred.callback(None)
# Flush stdin so that anything we typed doesn't go into the prompt
termios.tcflush(sys.stdin, termios.TCIOFLUSH)
mangle.set_intercept_requests(False)
mangle.set_intercept_responses(False)
def help_gencerts(self):
print ("Generate CA cert and private CA file\n"
"Usage: gencerts [/path/to/put/certs/in]")
@print_pappy_errors
def do_gencerts(self, line):
dest_dir = line or config.CERT_DIR
dest_dir = line or pappyproxy.config.CERT_DIR
print "This will overwrite any existing certs in %s. Are you sure?" % dest_dir
print "(y/N)",
answer = raw_input()
if not answer or answer[0].lower() != 'y':
return False
print "Generating certs to %s" % dest_dir
proxy.generate_ca_certs(dest_dir)
pappyproxy.proxy.generate_ca_certs(dest_dir)
def help_log(self):
print ("View the log\n"
@ -402,9 +454,9 @@ class ProxyCmd(cmd2.Cmd):
verbosity = int(line.strip())
except:
verbosity = 1
config.DEBUG_VERBOSITY = verbosity
pappyproxy.config.DEBUG_VERBOSITY = verbosity
raw_input()
config.DEBUG_VERBOSITY = 0
pappyproxy.config.DEBUG_VERBOSITY = 0
@print_pappy_errors
def do_testerror(self, line):
@ -437,6 +489,13 @@ class ProxyCmd(cmd2.Cmd):
def do_sls(self, line):
self.onecmd('scope_list %s' % line)
def help_viq(self):
self.help_view_request_info()
@print_pappy_errors
def do_viq(self, line):
self.onecmd('view_request_info %s' % line)
def help_vhq(self):
self.help_view_request_headers()
@ -512,11 +571,14 @@ class ProxyCmd(cmd2.Cmd):
def cmd_failure(cmd):
print "FAILURE"
def edit_file(fname):
def edit_file(fname, front=False):
global edit_queue
# Adds the filename to the edit queue. Returns a deferred that is fired once
# the file is edited and the editor is closed
d = defer.Deferred()
if front:
edit_queue = [(fname, d)] + edit_queue
else:
edit_queue.append((fname, d))
return d
@ -585,7 +647,7 @@ def printable_data(data):
@crochet.wait_for(timeout=30.0)
@defer.inlineCallbacks
def get_unmangled(reqid):
req = yield http.Request.load_request(reqid)
req = yield pappyproxy.http.Request.load_request(reqid)
if req.unmangled:
defer.returnValue(req.unmangled.reqid)
else:
@ -599,24 +661,36 @@ def view_full_request(request, headers_only=False):
print printable_data(request.full_request)
def view_full_response(response, headers_only=False):
def check_type(response, against):
if 'Content-Type' in response.headers and against in response.headers['Content-Type']:
return True
return False
if headers_only:
print printable_data(response.raw_headers)
else:
print printable_data(response.full_response)
print response.raw_headers,
to_print = printable_data(response.raw_data)
if 'content-type' in response.headers:
try:
lexer = get_lexer_for_mimetype(response.headers['content-type'].split(';')[0])
to_print = pygments.highlight(to_print, lexer, TerminalFormatter())
except ClassNotFound:
pass
print to_print
def print_requests(requests):
# Print a table with info on all the requests in the list
cols = [
{'name':'ID'},
{'name':'Method'},
{'name':'Verb'},
{'name': 'Host'},
{'name':'Path', 'width':40},
{'name':'S-Code'},
{'name':'Req Len'},
{'name':'Rsp Len'},
{'name':'Time'},
{'name': 'Prt'},
{'name': 'SSL'},
{'name':'Mngl'},
]
rows = []
@ -624,9 +698,9 @@ def print_requests(requests):
rid = request.reqid
method = request.verb
host = request.headers['host']
path = request.path
path = request.full_path
reqlen = len(request.raw_data)
rsplen = 'None'
rsplen = 'N/A'
mangle_str = '--'
if request.unmangled:
@ -656,6 +730,63 @@ def print_requests(requests):
is_ssl = 'NO'
rows.append([rid, method, host, path, response_code,
reqlen, rsplen, time_str, port, is_ssl, mangle_str])
reqlen, rsplen, time_str, mangle_str])
print_table(cols, rows)
def print_request_extended(request):
# Prints extended info for the request
title = "Request Info (reqid=%d)" % request.reqid
print title
print '-'*len(title)
reqlen = len(request.raw_data)
reqlen = '%d bytes' % reqlen
rsplen = 'No response'
mangle_str = 'Nothing mangled'
if request.unmangled:
mangle_str = 'Request'
if request.response:
response_code = str(request.response.response_code) + \
' ' + request.response.response_text
rsplen = len(request.response.raw_data)
rsplen = '%d bytes' % rsplen
if request.response.unmangled:
if mangle_str == 'Nothing mangled':
mangle_str = 'Response'
else:
mangle_str += ' and Response'
else:
response_code = ''
time_str = '--'
if request.time_start and request.time_end:
time_delt = request.time_end - request.time_start
time_str = "%.2f sec" % time_delt.total_seconds()
port = request.port
if request.is_ssl:
is_ssl = 'YES'
else:
is_ssl = 'NO'
if request.time_start:
time_made_str = request.time_start.strftime('%a, %b %d, %Y, %I:%M:%S %p')
else:
time_made_str = '--'
print 'Made on %s' % time_made_str
print 'ID: %d' % request.reqid
print 'Verb: %s' % request.verb
print 'Host: %s' % request.host
print 'Path: %s' % request.full_path
print 'Status Code: %s' % response_code
print 'Request Length: %s' % reqlen
print 'Response Length: %s' % rsplen
if request.response and request.response.unmangled:
print 'Unmangled Response Length: %s bytes' % len(request.response.unmangled.full_response)
print 'Time: %s' % time_str
print 'Port: %s' % request.port
print 'SSL: %s' % is_ssl
print 'Mangled: %s' % mangle_str

@ -1,6 +1,6 @@
from pappyproxy import http
from twisted.internet import defer
from util import PappyException
import http
import shlex
@ -384,20 +384,20 @@ def get_relation(s):
return cmp_contains
elif s in ("containsr", "ctr"):
# TODO
return None
raise PappyException("Contains (regexp) is not implemented yet. Sorry.")
elif s in ("exists", "ex"):
return cmp_exists
elif s in ("Leq", "L="):
elif s in ("Leq"):
return cmp_len_eq
elif s in ("Lgt", "L>"):
elif s in ("Lgt"):
return cmp_len_gt
elif s in ("Llt", "L<"):
elif s in ("Llt"):
return cmp_len_lt
elif s in ("eq", "="):
elif s in ("eq"):
return cmp_eq
elif s in ("gt", ">"):
elif s in ("gt"):
return cmp_gt
elif s in ("lt", "<"):
elif s in ("lt"):
return cmp_lt
raise FilterParseError("Invalid relation: %s" % s)

@ -1,18 +1,16 @@
import base64
import collections
import console
import context
import crochet
import datetime
import gzip
import json
import proxy
import pappyproxy
import re
import StringIO
import urlparse
import zlib
from twisted.internet import defer, reactor
from util import PappyException
from pappyproxy.util import PappyException
ENCODE_NONE = 0
ENCODE_DEFLATE = 1
@ -20,7 +18,7 @@ ENCODE_GZIP = 2
dbpool = None
class DataAlreadyComplete(Exception):
class DataAlreadyComplete(PappyException):
pass
def init(pool):
@ -64,6 +62,19 @@ def strip_leading_newlines(string):
string = string[1:]
return string
def consume_line(instr):
# returns (line, rest)
l = []
pos = 0
while pos < len(instr):
if instr[pos] == '\n':
if l and l[-1] == '\r':
l = l[:-1]
return (''.join(l), instr[pos+1:])
l.append(instr[pos])
pos += 1
return (''.join(l), '')
class RepeatableDict:
"""
A dict that retains the order of items inserted and keeps track of
@ -415,6 +426,14 @@ class Request(object):
def status_line(self):
if not self.verb and not self.path and not self.version:
return ''
return '%s %s %s' % (self.verb, self.full_path, self.version)
@status_line.setter
def status_line(self, val):
self.handle_statusline(val)
@property
def full_path(self):
path = self.path
if self.get_params:
path += '?'
@ -428,11 +447,7 @@ class Request(object):
if self.fragment:
path += '#'
path += self.fragment
return '%s %s %s' % (self.verb, path, self.version)
@status_line.setter
def status_line(self, val):
self.handle_statusline(val)
return path
@property
def raw_headers(self):
@ -460,6 +475,32 @@ class Request(object):
self.update_from_data()
self.complete = True
@property
def url(self):
if self.is_ssl:
retstr = 'https://'
else:
retstr = 'http://'
retstr += self.host
if not ((self.is_ssl and self.port == 443) or \
(not self.is_ssl and self.port == 80)):
retstr += ':%d' % self.port
if self.path:
retstr += self.path
if self.get_params:
retstr += '?'
pairs = []
for p in self.get_params.all_pairs():
pairs.append('='.join(p))
retstr += '&'.join(pairs)
if self.fragment:
retstr += '#%s' % self.fragment
return retstr
@url.setter
def url(self, val):
self._handle_statusline_uri(val)
def set_dict_callbacks(self):
# Add callbacks to dicts
self.headers.set_modify_callback(self.update_from_text)
@ -473,22 +514,19 @@ class Request(object):
if full_request == '':
return
# We do redundant splits, but whatever
lines = full_request.splitlines()
for line in lines:
if self.headers_complete:
break
remaining = full_request
while remaining and not self.headers_complete:
line, remaining = consume_line(remaining)
self.add_line(line)
if not self.headers_complete:
self.add_line('')
if not self.complete:
data = full_request[self.header_len:]
if update_content_length:
self.raw_data = data
self.raw_data = remaining
else:
self.add_data(data)
self.add_data(remaining)
assert(self.complete)
def update_from_data(self):
@ -533,20 +571,22 @@ class Request(object):
else:
self._partial_data += data
def _process_host(self, hostline, overwrite=False):
# Only overwrite if told to since we may set it during the CONNECT request and we don't want to
# overwrite that
def _process_host(self, hostline):
# Get address and port
# Returns true if port was explicitly stated
port_given = False
if ':' in hostline:
self.host, self.port = hostline.split(':')
self.port = int(self.port)
if self.port == 443:
self.is_ssl = True
port_given = True
else:
self.host = hostline
if not self.port or overwrite: # could be changed by connect request
if not self.port:
self.port = 80
self.host = self.host.strip()
return port_given
def add_line(self, line):
# Add a line (for status line and headers)
@ -572,28 +612,28 @@ class Request(object):
self.headers.append(key, val, do_callback=False)
self.header_len += len(line)+2
def handle_statusline(self, status_line):
parts = status_line.split()
uri = None
if len(parts) == 3:
self.verb, uri, self.version = parts
elif len(parts) == 2:
self.verb, self.version = parts
else:
raise Exception("Unexpected format of first line of request")
# Get path using urlparse
if uri is not None:
def _handle_statusline_uri(self, uri):
if not re.match('(?:^.+)://', uri):
uri = '//' + uri
parsed_path = urlparse.urlparse(uri)
netloc = parsed_path.netloc
self._process_host(netloc)
port_given = False
if netloc:
port_given = self._process_host(netloc)
# Check for https
if re.match('^https://', uri) or self.port == 443:
self.is_ssl = True
if not port_given:
self.port = 443
if re.match('^http://', uri):
self.is_ssl = False
if not self.port:
if self.is_ssl:
self.port = 443
else:
self.port = 80
reqpath = parsed_path.path
self.path = parsed_path.path
@ -606,6 +646,20 @@ class Request(object):
reqpath += parsed_path.fragment
self.fragment = parsed_path.fragment
def handle_statusline(self, status_line):
parts = status_line.split()
uri = None
if len(parts) == 3:
self.verb, uri, self.version = parts
elif len(parts) == 2:
self.verb, self.version = parts
else:
raise Exception("Unexpected format of first line of request")
# Get path using urlparse
if uri is not None:
self._handle_statusline_uri(uri)
def handle_header(self, key, val):
# We may have duplicate headers
stripped = False
@ -795,10 +849,10 @@ class Request(object):
@defer.inlineCallbacks
def submit(host, port, is_ssl, full_request):
new_obj = Request(full_request)
factory = proxy.ProxyClientFactory(new_obj)
factory.connection_id = proxy.get_next_connection_id()
factory = pappyproxy.proxy.ProxyClientFactory(new_obj)
factory.connection_id = pappyproxy.proxy.get_next_connection_id()
if is_ssl:
reactor.connectSSL(host, port, factory, proxy.ClientTLSContext())
reactor.connectSSL(host, port, factory, pappyproxy.proxy.ClientTLSContext())
else:
reactor.connectTCP(host, port, factory)
new_req = yield factory.data_defer
@ -860,7 +914,7 @@ class Request(object):
newreq = yield Request.load_request(int(r[0]))
reqs.append(newreq)
reqs = context.filter_reqs(reqs, filters)
reqs = pappyproxy.context.filter_reqs(reqs, filters)
defer.returnValue(reqs)
@ -942,21 +996,19 @@ class Response(object):
if full_response == '':
return
# We do redundant splits, but whatever
lines = full_response.splitlines()
for line in lines:
if self.headers_complete:
break
remaining = full_response
while remaining and not self.headers_complete:
line, remaining = consume_line(remaining)
self.add_line(line)
if not self.headers_complete:
self.add_line('')
if not self.complete:
data = full_response[self.header_len:]
if update_content_length:
self.raw_data = data
self.raw_data = remaining
else:
self.add_data(data)
self.add_data(remaining)
assert(self.complete)
def add_line(self, line):

@ -1,10 +1,9 @@
import console
import context
import proxy
import os
import string
import subprocess
import tempfile
import http
import pappyproxy
from twisted.internet import defer
@ -31,7 +30,7 @@ def mangle_request(request, connection_id):
orig_req.is_ssl = request.is_ssl
retreq = orig_req
if context.in_scope(orig_req):
if pappyproxy.context.in_scope(orig_req):
if intercept_requests: # if we want to mangle...
# Write original request to the temp file
with tempfile.NamedTemporaryFile(delete=False) as tf:
@ -39,7 +38,7 @@ def mangle_request(request, connection_id):
tf.write(orig_req.full_request)
# Have the console edit the file
yield console.edit_file(tfName)
yield pappyproxy.console.edit_file(tfName)
# Create new mangled request from edited file
with open(tfName, 'r') as f:
@ -47,9 +46,11 @@ def mangle_request(request, connection_id):
mangled_req.is_ssl = orig_req.is_ssl
mangled_req.port = orig_req.port
os.remove(tfName)
# Check if dropped
if mangled_req.full_request == '':
proxy.log('Request dropped!')
pappyproxy.proxy.log('Request dropped!')
defer.returnValue(None)
# Check if it changed
@ -59,9 +60,9 @@ def mangle_request(request, connection_id):
retreq = mangled_req
# Add our request to the context
context.add_request(retreq)
pappyproxy.context.add_request(retreq)
else:
proxy.log('Out of scope! Request passed along unharmed', id=connection_id)
pappyproxy.proxy.log('Out of scope! Request passed along unharmed', id=connection_id)
active_requests[connection_id] = retreq
retreq.submitted = True
@ -79,7 +80,7 @@ def mangle_response(response, connection_id):
orig_rsp = http.Response(response.full_response)
retrsp = orig_rsp
if context.in_scope(myreq):
if pappyproxy.context.in_scope(myreq):
if intercept_responses: # If we want to mangle...
# Write original request to the temp file
with tempfile.NamedTemporaryFile(delete=False) as tf:
@ -87,15 +88,17 @@ def mangle_response(response, connection_id):
tf.write(orig_rsp.full_response)
# Have the console edit the file
yield console.edit_file(tfName)
yield pappyproxy.console.edit_file(tfName, front=True)
# Create new mangled request from edited file
with open(tfName, 'r') as f:
mangled_rsp = http.Response(f.read(), update_content_length=True)
os.remove(tfName)
# Check if dropped
if mangled_rsp.full_response == '':
proxy.log('Response dropped!')
pappyproxy.proxy.log('Response dropped!')
defer.returnValue(None)
if mangled_rsp.full_response != orig_rsp.full_response:
@ -108,10 +111,10 @@ def mangle_response(response, connection_id):
myreq.unmangled.save()
myreq.response = retrsp
else:
proxy.log('Out of scope! Response passed along unharmed', id=connection_id)
pappyproxy.proxy.log('Out of scope! Response passed along unharmed', id=connection_id)
del active_requests[connection_id]
myreq.response = retrsp
context.filter_recheck()
pappyproxy.context.filter_recheck()
defer.returnValue(myreq)
def connection_lost(connection_id):

@ -1,19 +1,21 @@
#!/usr/bin/env python2
import argparse
import cmd2
import config
import console
import comm
import context
import crochet
import http
import imp
import os
import schema.update
import proxy
import shutil
import sys
import sqlite3
import tempfile
from pappyproxy import console
from pappyproxy import config
from pappyproxy import comm
from pappyproxy import http
from pappyproxy import context
from pappyproxy import proxy
from twisted.enterprise import adbapi
from twisted.internet import reactor, defer
from twisted.internet.threads import deferToThread
@ -22,17 +24,52 @@ from twisted.internet.protocol import ServerFactory
crochet.no_setup()
def parse_args():
# parses sys.argv and returns a settings dictionary
parser = argparse.ArgumentParser(description='An intercepting proxy for testing web applications.')
parser.add_argument('-l', '--lite', help='Run the proxy in "lite" mode', action='store_true')
args = parser.parse_args(sys.argv[1:])
settings = {}
if args.lite:
settings['lite'] = True
else:
settings['lite'] = False
return settings
def set_text_factory(conn):
conn.text_factory = str
def delete_datafile():
print 'Deleting temporary datafile'
os.remove(config.DATAFILE)
@defer.inlineCallbacks
def main():
settings = parse_args()
if settings['lite']:
conf_settings = config.get_default_config()
conf_settings['debug_dir'] = None
conf_settings['debug_to_file'] = False
with tempfile.NamedTemporaryFile(delete=False) as tf:
conf_settings['data_file'] = tf.name
print 'Temporary datafile is %s' % tf.name
delete_data_on_quit = True
config.load_settings(conf_settings)
else:
# Initialize config
config.load_from_file('./config.json')
delete_data_on_quit = False
# If the data file doesn't exist, create it with restricted permissions
if not os.path.isfile(config.DATAFILE):
with os.fdopen(os.open(config.DATAFILE, os.O_CREAT, 0o0600), 'r') as f:
pass
# Set up data store
dbpool = adbapi.ConnectionPool("sqlite3", config.DATAFILE,
check_same_thread=False,
cp_openfun=set_text_factory,
@ -64,13 +101,19 @@ def main():
comm_port = reactor.listenTCP(0, com_factory, interface='127.0.0.1')
comm.set_comm_port(comm_port.getHost().port)
sys.argv = [sys.argv[0]] # cmd2 tries to parse args
d = deferToThread(console.ProxyCmd().cmdloop)
d.addCallback(lambda ignored: reactor.stop())
if delete_data_on_quit:
d.addCallback(lambda ignored: delete_datafile())
# Load the scope
yield context.load_scope(http.dbpool)
context.reset_to_scope()
if __name__ == '__main__':
def start():
reactor.callWhenRunning(main)
reactor.run()
if __name__ == '__main__':
start()

@ -1,10 +1,5 @@
import config
import console
import context
import datetime
import gzip
import mangle
import http
import os
import random
import re
@ -16,6 +11,12 @@ import sys
import urlparse
import zlib
from OpenSSL import SSL
from pappyproxy import config
from pappyproxy import console
from pappyproxy import context
from pappyproxy import http
from pappyproxy import mangle
from pappyproxy.util import PappyException
from twisted.enterprise import adbapi
from twisted.internet import reactor, ssl
from twisted.internet.protocol import ClientFactory
@ -69,6 +70,7 @@ class ProxyClient(LineReceiver):
self._response_sent = False
self._sent = False
self.request = request
self.data_defer = defer.Deferred()
self._response_obj = http.Response()
@ -121,6 +123,7 @@ class ProxyClient(LineReceiver):
if not self._sent:
self.transport.write(mangled_request.full_request)
self._sent = True
self.data_defer.callback(mangled_request.full_request)
def handle_response_end(self, *args, **kwargs):
self.log("Remote response finished, returning data to original stream")
@ -180,8 +183,7 @@ class ProxyServer(LineReceiver):
self._request_obj = http.Request()
self._connect_response = False
self._forward = True
self._port = None
self._host = None
self._connect_uri = None
def lineReceived(self, *args, **kwargs):
line = args[0]
@ -191,18 +193,17 @@ class ProxyServer(LineReceiver):
if self._request_obj.verb.upper() == 'CONNECT':
self._connect_response = True
self._forward = False
# For if we only get the port in the connect request
if self._request_obj.port is not None:
self._port = self._request_obj.port
if self._request_obj.host is not None:
self._host = self._request_obj.host
self._connect_uri = self._request_obj.url
if self._request_obj.headers_complete:
self.setRawMode()
if self._request_obj.complete:
self.setLineMode()
try:
self.full_request_received()
except PappyException as e:
print str(e)
def rawDataReceived(self, *args, **kwargs):
data = args[0]
@ -210,7 +211,10 @@ class ProxyServer(LineReceiver):
self.log(data, symbol='d>', verbosity_level=3)
if self._request_obj.complete:
try:
self.full_request_received()
except PappyException as e:
print str(e)
def full_request_received(self, *args, **kwargs):
global cached_certs
@ -256,10 +260,8 @@ class ProxyServer(LineReceiver):
self._connect_response = False
self._forward = True
self._request_obj = http.Request()
if self._port is not None:
self._request_obj.port = self._port
if self._host is not None:
self._request_obj.host = self._host
if self._connect_uri:
self._request_obj.url = self._connect_uri
self.setLineMode()
def send_response_back(self, response):
@ -303,12 +305,24 @@ def generate_cert_serial():
# Generates a random serial to be used for the cert
return random.getrandbits(8*20)
def generate_cert(hostname, cert_dir):
def load_certs_from_dir(cert_dir):
try:
with open(cert_dir+'/'+config.SSL_CA_FILE, 'rt') as f:
ca_raw = f.read()
except IOError:
raise PappyException("Could not load CA cert!")
try:
with open(cert_dir+'/'+config.SSL_PKEY_FILE, 'rt') as f:
ca_key_raw = f.read()
except IOError:
raise PappyException("Could not load CA private key!")
return (ca_raw, ca_key_raw)
def generate_cert(hostname, cert_dir):
(ca_raw, ca_key_raw) = load_certs_from_dir(cert_dir)
ca_cert = crypto.load_certificate(crypto.FILETYPE_PEM, ca_raw)
ca_key = crypto.load_privatekey(crypto.FILETYPE_PEM, ca_key_raw)

@ -1,7 +1,8 @@
import comm
import subprocess
import os
from pappyproxy import comm
def start_editor(reqid):
script_loc = os.path.join(os.path.dirname(__file__), "vim_repeater", "repeater.vim")
#print "RepeaterSetup %d %d"%(reqid, comm_port)

@ -1,4 +1,4 @@
import http
from pappyproxy import http
from twisted.internet import defer
"""

@ -34,11 +34,16 @@ def add_schema_files(schemas):
def update_schema(dbpool):
# Update the database schema to the latest version
schema_version = yield get_schema_version(dbpool)
if schema_version == 0:
verbose_update = False
else:
verbose_update = True
schemas = []
add_schema_files(schemas)
schemas = sorted(schemas, key=lambda tup: tup[0])
for i in range(schema_version, len(schemas)):
# schemas[0] is v1, schemas[1] is v2, etc
if verbose_update:
print "Updating datafaile schema to version %d" % (i+1)
yield schemas[i][1].update(dbpool)

@ -1,7 +1,7 @@
import pytest
import context
from http import Request, Response, ResponseCookie
from pappyproxy import context
from pappyproxy.http import Request, Response, ResponseCookie
@pytest.fixture
def http_request():

@ -5,7 +5,7 @@ import pytest
import StringIO
import zlib
from pappy import http
from pappyproxy.pappy import http
####################
# Helper Functions
@ -528,11 +528,15 @@ def test_request_parse_host():
def test_request_newline_delim():
r = http.Request(('GET / HTTP/1.1\n'
'Content-Length: 4\n'
'Test-Header: foo\r\n'
'Other-header: bar\n\r\n'))
'Other-header: bar\n\r\n'
'AAAA'))
assert r.full_request == ('GET / HTTP/1.1\r\n'
'Content-Length: 4\r\n'
'Test-Header: foo\r\n'
'Other-header: bar\r\n\r\n')
'Other-header: bar\r\n\r\n'
'AAAA')
def test_repeated_request_headers():
header_lines = [
@ -637,12 +641,11 @@ def test_request_to_json():
r.response = rsp
expected_reqdata = {'full_request': base64.b64encode(r.full_request),
'response_id': rsp.rspid,
'port': 80,
'is_ssl': False,
#'tag': r.tag,
'reqid': r.reqid,
expected_reqdata = {u'full_request': unicode(base64.b64encode(r.full_request)),
u'response_id': rsp.rspid,
u'port': 80,
u'is_ssl': False,
u'reqid': r.reqid,
}
assert json.loads(r.to_json()) == expected_reqdata
@ -693,6 +696,45 @@ def test_request_blank_cookies():
'Cookie: a=b; foo; c=d\r\n'))
assert r.cookies['foo'] == ''
def test_request_set_url():
r = http.Request('GET / HTTP/1.1\r\n')
r.url = 'www.AAAA.BBBB'
assert r.host == 'www.AAAA.BBBB'
assert r.port == 80
assert not r.is_ssl
r.url = 'https://www.AAAA.BBBB'
assert r.host == 'www.AAAA.BBBB'
assert r.port == 443
assert r.is_ssl
r.url = 'https://www.AAAA.BBBB:1234'
assert r.host == 'www.AAAA.BBBB'
assert r.port == 1234
assert r.is_ssl
r.url = 'http://www.AAAA.BBBB:443'
assert r.host == 'www.AAAA.BBBB'
assert r.port == 443
assert not r.is_ssl
r.url = 'www.AAAA.BBBB:443'
assert r.host == 'www.AAAA.BBBB'
assert r.port == 443
assert r.is_ssl
def test_request_set_url_params():
r = http.Request('GET / HTTP/1.1\r\n')
r.url = 'www.AAAA.BBBB?a=b&c=d#foo'
assert r.get_params.all_pairs() == [('a','b'), ('c','d')]
assert r.fragment == 'foo'
assert r.url == 'http://www.AAAA.BBBB?a=b&c=d#foo'
r.port = 400
assert r.url == 'http://www.AAAA.BBBB:400?a=b&c=d#foo'
r.is_ssl = True
assert r.url == 'https://www.AAAA.BBBB:400?a=b&c=d#foo'
####################
## Response tests
@ -1050,3 +1092,11 @@ def test_response_blank_headers():
assert r.headers['header'] == ''
assert r.headers['header2'] == ''
def test_response_newlines():
r = http.Response(('HTTP/1.1 200 OK\n'
'Content-Length: 4\n\r\n'
'AAAA'))
assert r.full_response == ('HTTP/1.1 200 OK\r\n'
'Content-Length: 4\r\n\r\n'
'AAAA')

@ -0,0 +1,220 @@
import pytest
import mock
import twisted.internet
import twisted.test
from pappyproxy import http
from pappyproxy import mangle
from pappyproxy.proxy import ProxyClient, ProxyClientFactory, ProxyServer
from testutil import mock_deferred, func_deleted, func_ignored_deferred, func_ignored, no_tcp
from twisted.internet.protocol import ServerFactory
from twisted.test.iosim import FakeTransport
from twisted.internet import defer, reactor
####################
## Fixtures
@pytest.fixture
def unconnected_proxyserver(mocker):
mocker.patch("twisted.test.iosim.FakeTransport.startTLS")
mocker.patch("pappyproxy.proxy.load_certs_from_dir", new=mock_generate_cert)
factory = ServerFactory()
factory.protocol = ProxyServer
protocol = factory.buildProtocol(('127.0.0.1', 0))
protocol.makeConnection(FakeTransport(protocol, True))
return protocol
@pytest.fixture
def proxyserver(mocker):
mocker.patch("twisted.test.iosim.FakeTransport.startTLS")
mocker.patch("pappyproxy.proxy.load_certs_from_dir", new=mock_generate_cert)
factory = ServerFactory()
factory.protocol = ProxyServer
protocol = factory.buildProtocol(('127.0.0.1', 0))
protocol.makeConnection(FakeTransport(protocol, True))
protocol.lineReceived('CONNECT https://www.AAAA.BBBB:443 HTTP/1.1')
protocol.lineReceived('')
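# Presumably this drains the 'HTTP/1.1 200 Connection established' reply
# from the fake transport so later assertions only see fresh output.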
protocol.transport.getOutBuffer()
return protocol
@pytest.fixture
def proxy_connection():
@defer.inlineCallbacks
def gen_connection(send_data):
factory = ProxyClientFactory(http.Request(send_data))
protocol = factory.buildProtocol(None)
tr = FakeTransport(protocol, True)
protocol.makeConnection(tr)
sent = yield protocol.data_defer
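# protocol.data_defer fires with the (possibly mangled) request bytes once written;
# factory.data_defer, returned below, fires with the request object once the full response arrives.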
defer.returnValue((protocol, sent, factory.data_defer))
return gen_connection
## Autorun fixtures
# @pytest.fixture(autouse=True)
# def no_mangle(mocker):
# # Don't call anything in mangle.py
# mocker.patch("mangle.mangle_request", notouch_mangle_req)
# mocker.patch("mangle.mangle_response", notouch_mangle_rsp)
@pytest.fixture(autouse=True)
def ignore_save(mocker):
mocker.patch("pappyproxy.http.Request.deep_save", func_ignored_deferred)
####################
## Mock functions
def mock_generate_cert(cert_dir):
private_key = ('-----BEGIN PRIVATE KEY-----\n'
'MIIEvwIBADANBgkqhkiG9w0BAQEFAASCBKkwggSlAgEAAoIBAQDAoClrYUEB7lM0\n'
'zQaKkXZVG2d1Bu9hV8urpx0gNXMbyZ2m3xb+sKZju/FHPuWenA4KaN5gRUT+oLfv\n'
'tnF6Ia0jpRNWnX0Fyn/irdg1BWGJn7k7mJ2D0NXZQczn2+xxY05599NfGWqNKCYy\n'
'jhSwPsUK+sGJqi7aSDdlS97ZTjrQVTTFsC0+kSu4lS5fsWXxqrKLa6Ao8W7abVRO\n'
'JHazh/cxM4UKpgWU+E6yD4o4ZgHY+SMTVUh/IOM8DuOVyLEWtx4oLNiLMlpWT3qy\n'
'4IMpOF6VuU6JF2HGV13SoJfhsLXsPRbLVTAnZvJcZwtgDm6NfKapU8W8olkDV1Bf\n'
'YQEMSNX7AgMBAAECggEBAII0wUrAdrzjaIMsg9tu8FofKBPHGFDok9f4Iov/FUwX\n'
'QOXnrxeCOTb5d+L89SH9ws/ui0LwD+8+nJcA8DvqP6r0jtnhov0jIMcNVDSi6oeo\n'
'3AEY7ICJzcQJ4oRn+K+8vPNdPhfuikPYe9l4iSuJgpAlaGWyD/GlFyz12DFz2/Wu\n'
'NIcqR1ucvezRHn3eGMtvDv2WGaN4ifUc30k8XgSUesmwSI6beb5+hxq7wXfsurnP\n'
'EUrPY9ts3lfiAgxzTKOuj1VR5hn7cJyLN8jF0mZs4D6eSSHorIddhmaNiCq5ZbMd\n'
'QdlDiPvnXHT41OoXOb7tDEt7SGoiRh2noCZ1aZiSziECgYEA+tuPPLYWU6JRB6EW\n'
'PhbcXQbh3vML7eT1q7DOz0jYCojgT2+k7EWSI8T830oQyjbpe3Z86XEgH7UBjUgq\n'
'27nJ4E6dQDYGbYCKEklOoCGLE7A60i1feIz8otOQRrbQ4jcpibEgscA6gzHmunYf\n'
'De5euUgYW+Rq2Vmr6/NzUaUgui8CgYEAxJMDwPOGgiLM1cczlaSIU9Obz+cVnwWn\n'
'nsdKYMto2V3yKLydDfjsgOgzxHOxxy+5L645TPxK6CkiISuhJ93kAFFtx+1sCBCT\n'
'tVzY5robVAekxA9tlPIxtsn3+/axx3n6HnV0oA/XtxkuOS5JImgEdXqFwJZkerGE\n'
'waftIU2FCfUCgYEArl8+ErJzlJEIiCgWIPSdGuD00pfZW/TCPCT7rKRy3+fDHBR7\n'
'7Gxzp/9+0utV/mnrJBH5w/8JmGCmgoF+oRtk01FyBzdGgolN8GYajD6kwPvH917o\n'
'tRAzcC9lY3IigoxbiEWid0wqoBVoz4XaEkH2gA44OG/vQcQOOEYSi9cfh6sCgYBg\n'
'KLaOXdJvuIxRCzgNvMW/k+VFh3pJJx//COg2f2qT4mQCT3nYiutOh8hDEoFluc+y\n'
'Jlz7bvNJrE14wnn8IYxWJ383bMoLC+jlsDyeaW3S5kZQbmehk/SDwTrg86W1udKD\n'
'sdtSLU3N0LCO4jh+bzm3Ki9hrXALoOkbPoU+ZEhvPQKBgQDf79XQ3RNxZSk+eFyq\n'
'qD8ytVqxEoD+smPDflXXseVH6o+pNWrF8+A0KqmO8c+8KVzWj/OfULO6UbKd3E+x\n'
'4JGkWu9yF1lEgtHgibF2ER8zCSIL4ikOEasPCkrKj5SrS4Q+j4u5ha76dIc2CVu1\n'
'hkX2PQ1xU4ocu06k373sf73A4Q==\n'
'-----END PRIVATE KEY-----')
ca_key = ('-----BEGIN CERTIFICATE-----\n'
'MIIDjzCCAncCFQCjC8r+I4xa7JoGUJYGOTcqDROA0DANBgkqhkiG9w0BAQsFADBg\n'
'MQswCQYDVQQGEwJVUzERMA8GA1UECBMITWljaGlnYW4xEjAQBgNVBAcTCUFubiBB\n'
'cmJvcjEUMBIGA1UEChMLUGFwcHkgUHJveHkxFDASBgNVBAMTC1BhcHB5IFByb3h5\n'
'MB4XDTE1MTEyMDIxMTEzOVoXDTI1MTExNzIxMTEzOVowYDELMAkGA1UEBhMCVVMx\n'
'ETAPBgNVBAgTCE1pY2hpZ2FuMRIwEAYDVQQHEwlBbm4gQXJib3IxFDASBgNVBAoT\n'
'C1BhcHB5IFByb3h5MRQwEgYDVQQDEwtQYXBweSBQcm94eTCCASIwDQYJKoZIhvcN\n'
'AQEBBQADggEPADCCAQoCggEBAMCgKWthQQHuUzTNBoqRdlUbZ3UG72FXy6unHSA1\n'
'cxvJnabfFv6wpmO78Uc+5Z6cDgpo3mBFRP6gt++2cXohrSOlE1adfQXKf+Kt2DUF\n'
'YYmfuTuYnYPQ1dlBzOfb7HFjTnn3018Zao0oJjKOFLA+xQr6wYmqLtpIN2VL3tlO\n'
'OtBVNMWwLT6RK7iVLl+xZfGqsotroCjxbtptVE4kdrOH9zEzhQqmBZT4TrIPijhm\n'
'Adj5IxNVSH8g4zwO45XIsRa3Higs2IsyWlZPerLggyk4XpW5TokXYcZXXdKgl+Gw\n'
'tew9FstVMCdm8lxnC2AObo18pqlTxbyiWQNXUF9hAQxI1fsCAwEAAaNFMEMwEgYD\n'
'VR0TAQH/BAgwBgEB/wIBADAOBgNVHQ8BAf8EBAMCAQYwHQYDVR0OBBYEFNo5o+5e\n'
'a0sNMlW/75VgGJCv2AcJMA0GCSqGSIb3DQEBCwUAA4IBAQBdJDhxbmoEe27bD8me\n'
'YTcLGjs/StKkSil7rLbX+tBCwtkm5UEEejBuAhKk2FuAXW8yR1FqKJSZwVCAocBT\n'
'Bo/+97Ee+h7ywrRFhATEr9D/TbbHKOjCjDzOMl9yLZa2DKErZjbI30ZD6NafWS/X\n'
'hx5X1cGohHcVVzT4jIgUEU70vvYfNn8CTZm4oJ7qqRe/uQPUYy0rwvbd60oprtGg\n'
'jNv1H5R4ODHUMBXAI9H7ft9cWrd0fBQjxhoj8pvgJXEZ52flXSqQc7qHLg1wO/zC\n'
'RUgpTcNAb2qCssBKbj+c1vKEPRUJfw6UYb0s1462rQNc8BgZiKaNbwokFmkAnjUg\n'
'AvnX\n'
'-----END CERTIFICATE-----')
return (ca_key, private_key)
def notouch_mangle_req(request, conn_id):
orig_req = http.Request(request.full_request)
orig_req.port = request.port
orig_req.is_ssl = request.is_ssl
d = mock_deferred(orig_req)
return d
def notouch_mangle_rsp(response, conn_id):
req = http.Request()
orig_rsp = http.Response(response.full_response)
req.response = orig_rsp
d = mock_deferred(req)
return d
def req_mangler_change(request, conn_id):
req = http.Request('GET /mangled HTTP/1.1\r\n\r\n')
d = mock_deferred(req)
return d
def rsp_mangler_change(request, conn_id):
req = http.Request()
rsp = http.Response('HTTP/1.1 500 MANGLED\r\n\r\n')
req.response = rsp
d = mock_deferred(req)
return d
####################
## Unit test tests
def test_proxy_server_fixture(unconnected_proxyserver):
unconnected_proxyserver.transport.write('hello')
assert unconnected_proxyserver.transport.getOutBuffer() == 'hello'
@pytest.inlineCallbacks
def test_mock_deferreds():
d = mock_deferred('Hello!')
r = yield d
assert r == 'Hello!'
def test_deleted():
with pytest.raises(NotImplementedError):
reactor.connectTCP("www.google.com", "80", ServerFactory)
with pytest.raises(NotImplementedError):
reactor.connectSSL("www.google.com", "80", ServerFactory)
####################
## Proxy Server Tests
def test_proxy_server_connect(unconnected_proxyserver, mocker):
mocker.patch("twisted.internet.reactor.connectSSL")
unconnected_proxyserver.lineReceived('CONNECT https://www.dddddd.fff:433 HTTP/1.1')
unconnected_proxyserver.lineReceived('')
assert unconnected_proxyserver.transport.getOutBuffer() == 'HTTP/1.1 200 Connection established\r\n\r\n'
assert unconnected_proxyserver._request_obj.is_ssl
def test_proxy_server_basic(proxyserver, mocker):
mocker.patch("twisted.internet.reactor.connectSSL")
mocker.patch('pappyproxy.proxy.ProxyServer.setRawMode')
proxyserver.lineReceived('GET / HTTP/1.1')
proxyserver.lineReceived('')
assert proxyserver.setRawMode.called
args, kwargs = twisted.internet.reactor.connectSSL.call_args
assert args[0] == 'www.AAAA.BBBB'
assert args[1] == 443
@pytest.inlineCallbacks
def test_proxy_client_basic(mocker, proxy_connection):
mocker.patch('pappyproxy.mangle.mangle_request', new=notouch_mangle_req)
mocker.patch('pappyproxy.mangle.mangle_response', new=notouch_mangle_rsp)
# Make the connection
(prot, sent, resp_deferred) = yield proxy_connection('GET / HTTP/1.1\r\n\r\n')
assert sent == 'GET / HTTP/1.1\r\n\r\n'
prot.lineReceived('HTTP/1.1 200 OK')
prot.lineReceived('Content-Length: 0')
prot.lineReceived('')
ret_req = yield resp_deferred
response = ret_req.response.full_response
assert response == 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n'
@pytest.inlineCallbacks
def test_proxy_client_mangle_req(mocker, proxy_connection):
mocker.patch('pappyproxy.mangle.mangle_request', new=req_mangler_change)
mocker.patch('pappyproxy.mangle.mangle_response', new=notouch_mangle_rsp)
# Make the connection
(prot, sent, resp_deferred) = yield proxy_connection('GET / HTTP/1.1\r\n\r\n')
assert sent == 'GET /mangled HTTP/1.1\r\n\r\n'
@pytest.inlineCallbacks
def test_proxy_client_mangle_rsp(mocker, proxy_connection):
mocker.patch('pappyproxy.mangle.mangle_request', new=notouch_mangle_req)
mocker.patch('pappyproxy.mangle.mangle_response', new=rsp_mangler_change)
# Make the connection
(prot, sent, resp_deferred) = yield proxy_connection('GET / HTTP/1.1\r\n\r\n')
prot.lineReceived('HTTP/1.1 200 OK')
prot.lineReceived('Content-Length: 0')
prot.lineReceived('')
ret_req = yield resp_deferred
response = ret_req.response.full_response
assert response == 'HTTP/1.1 500 MANGLED\r\n\r\n'

@ -0,0 +1,42 @@
import pytest
from twisted.internet import defer
class ClassDeleted():
pass
def func_deleted(*args, **kwargs):
raise NotImplementedError()
def func_ignored(*args, **kwargs):
pass
def func_ignored_deferred(*args, **kwargs):
return mock_deferred(None)
def mock_deferred(value):
# Returns a deferred that has already fired with |value|; handy for mocking
# out functions that are supposed to return deferreds
def g(data):
return value
d = defer.Deferred()
d.addCallback(g)
d.callback(None)
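# Firing with None runs g, which swaps |value| in as the deferred's result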
return d
@pytest.fixture(autouse=True)
def no_tcp(mocker):
# Don't make tcp connections
mocker.patch("twisted.internet.reactor.connectTCP", new=func_deleted)
mocker.patch("twisted.internet.reactor.connectSSL", new=func_deleted)
@pytest.fixture
def ignore_tcp(mocker):
# Don't make tcp connections
mocker.patch("twisted.internet.reactor.connectTCP", new=func_ignored)
mocker.patch("twisted.internet.reactor.connectSSL", new=func_ignored)
@pytest.fixture(autouse=True)
def no_database(mocker):
# Don't make database queries
mocker.patch("twisted.enterprise.adbapi.ConnectionPool",
new=ClassDeleted)

@ -0,0 +1,2 @@
[metadata]
description-file = README.md

@ -2,21 +2,39 @@
from distutils.core import setup
setup(name='Pappy',
setup(name='pappyproxy',
version='0.0.1',
description='The Pappy Intercepting Proxy',
author='Rob Glew',
author_email='rglew56@gmail.com',
url='https://www.github.com/roglew/pappy-proxy',
packages=['pappy-proxy'],
packages=['pappyproxy'],
license='MIT',
entry_points = {
'console_scripts':['pappy = pappyproxy.pappy:start'],
},
long_description=open('README.md').read(),
keywords='http proxy hacking 1337hax pwnurmum',
install_requires=[
'twisted',
'crochet',
'cmd2',
'service_identity',
'pytest',
'pytest-cov',
'pytest-twisted',
'cmd2>=0.6.8',
'crochet>=1.4.0',
'pygments>=2.0.2',
'pytest-cov>=2.2.0',
'pytest-mock>=0.9.0',
'pytest-twisted>=1.5',
'pytest>=2.8.3',
'service_identity>=14.0.0',
'twisted>=15.4.0',
],
classifiers=[
'Intended Audience :: Developers',
'Operating System :: MacOS',
'Operating System :: POSIX :: Linux',
'Development Status :: 2 - Pre-Alpha',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'License :: OSI Approved :: MIT License',
'Topic :: Security',
'Topic :: Security :: Pwning Ur Mum',
]
)
)
