
Commit 68c37ce

Merge pull request #31 from bear/cleanup-README
Cleanup errors in the README for formatting and other markdown glitches
2 parents: 4286aed + aa4b7a8

File tree

2 files changed: +28 -26


README.md (+23 -21)

````diff
@@ -13,7 +13,7 @@ Python package to help with parsing, handling and other manipulations of the Ind
 
 See the examples/ directory for sample command line tools.
 
-Because Ronkyuu uses BeautifulSoup4 for it's amazing HTML wrangling ability, you have the option of enabling faster parsing via the `lxml` package instead of the default `html5lib` package. This is done by having `lxml` installed and...
+Because Ronkyuu uses [BeautifulSoup4](https://pypi.org/project/beautifulsoup4/) for it's amazing HTML wrangling ability, you have the option of enabling faster parsing via the [`lxml`](https://pypi.org/project/lxml/) package instead of the default [`html5lib`](https://pypi.org/project/html5lib/) package.
 
 ```
 import ronkyuu
@@ -28,35 +28,35 @@ Contributors
 
 WebMentions
 ===========
-findMentions()
---------------
-Find all <a /> elements in the html returned for a post.
-If any have an href attribute that is not from the one of the items in domains, append it to our lists.
+findMentions(sourceURL, targetURL, ...)
+---------------------------------------
+Find all `<a />` elements in the html returned for a post.
+If any have an `href` attribute that is not from the one of the items in domains, append it to our lists.
 
-findEndpoint()
---------------
-Search the given html content for all <link /> elements and return any discovered WebMention URL.
-
-discoverEndpoint()
+findEndpoint(html)
 ------------------
-Discover any WebMention endpoint for a given URL.
+Search the given `html` content for all `<link />` elements and return any discovered WebMention URL.
+
+discoverEndpoint(sourceURL, ...)
+--------------------------------
+Discover any WebMention endpoint for a given `url`.
 
 sendWebmention(sourceURL, targetURL, webmention=None)
 -----------------------------------------------------
-Send to the targetURL a WebMention for the sourceURL.
-The WebMention will be discovered if not given in the optional webmention parameter.
+Send to the `targetURL` a WebMention for the `sourceURL`.
+The WebMention will be discovered if it is not given in the optional `webmention` parameter.
 
 RelMe
 =====
-findRelMe()
------------
-Find all <a /> elements in the given html for a post.
-If any have an href attribute that is rel="me" then include it in the result.
+findRelMe(sourceURL)
+--------------------
+Find all `<a />` elements in the given html for a post.
+If any have an href attribute that is `rel="me"` then include it in the result.
 
-confirmRelMe()
---------------
+confirmRelMe(profileURL, resourceURL, profileRelMes, resourceRelMes)
+--------------------------------------------------------------------
 Determine if a given resourceURL is authoritative for the profileURL.
-The list of rel="me" links will be discovered if not provided in the optional profileRelMes parameter or the resourceRelMes paramter.
+The list of `rel="me"` links will be discovered if not provided in the optional profileRelMes parameter or the resourceRelMes paramter.
 
 Validators
 ==========
@@ -66,6 +66,8 @@ TODO: fill in details of how to use
 
 Requires
 ========
-Python v3.7+ but see `Pipfile` for a full list. The `Makefile` takes advantage of `Pipenv` (which will use `pyenv` if installed) to manage the Python dependencies.
+Python v3.9+ -- see `Pipfile` for the full list
+
+The `Makefile` takes advantage of `Pipenv` (which will use `pyenv` if installed) to manage the Python dependencies.
 
 For testing we use [httmock](https://pypi.python.org/pypi/httmock/) to mock the web calls.
````
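The README hunks above only list signatures, so here is a hedged sketch of how those calls might fit together, assuming the functions are importable from the top-level `ronkyuu` package as the README's `import ronkyuu` snippet suggests; the URLs are placeholders and the return values are left opaque because their exact shapes are not shown in this diff.

```python
# Hedged usage sketch based only on the signatures shown in the README diff.
import ronkyuu

sourceURL = 'https://example.com/my-post'        # placeholder: a post of ours that links out
targetURL = 'https://other.example/their-post'   # placeholder: a post being mentioned

# findMentions(sourceURL, targetURL, ...): scan the source post's HTML for
# outbound <a href="..."> links that could become WebMentions.
mentions = ronkyuu.findMentions(sourceURL, targetURL)

# discoverEndpoint(sourceURL, ...): locate the WebMention endpoint a URL
# advertises via its Link headers or <link /> elements.
endpoint = ronkyuu.discoverEndpoint(targetURL)

# sendWebmention(sourceURL, targetURL, webmention=None): deliver the mention;
# the endpoint is discovered automatically when webmention is not supplied.
response = ronkyuu.sendWebmention(sourceURL, targetURL)

# RelMe helpers: collect rel="me" links from a page, then check whether a
# resource is authoritative for a profile.
relMeLinks = ronkyuu.findRelMe(sourceURL)
confirmed = ronkyuu.confirmRelMe(profileURL='https://example.com/',
                                 resourceURL='https://social.example/@me')
```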

src/ronkyuu/webmention.py (+5 -5)

````diff
@@ -135,7 +135,7 @@ def findEndpoint(html):
     return None
 
 
-def discoverEndpoint(url, test_urls=True, headers=None, timeout=None, request=None, debug=False):
+def discoverEndpoint(sourceURL, test_urls=True, headers=None, timeout=None, request=None, debug=False):
     """Discover any WebMention endpoint for a given URL.
 
     :param link: URL to discover WebMention endpoint
@@ -150,7 +150,7 @@ def discoverEndpoint(url, test_urls=True, headers=None, timeout=None, request=No
     if headers is None:
         headers = {}
     if test_urls:
-        URLValidator(message='invalid URL')(url)
+        URLValidator(message='invalid URL')(sourceURL)
 
     # status, webmention
     endpointURL = None
@@ -159,9 +159,9 @@ def discoverEndpoint(url, test_urls=True, headers=None, timeout=None, request=No
         if request is not None:
             targetRequest = request
         else:
-            targetRequest = requests.get(url, verify=False, headers=headers, timeout=timeout)
+            targetRequest = requests.get(sourceURL, verify=False, headers=headers, timeout=timeout)
         returnCode = targetRequest.status_code
-        debugOutput.append('%s %s' % (returnCode, url))
+        debugOutput.append('%s %s' % (returnCode, sourceURL))
         if returnCode == requests.codes.ok:  # pylint: disable=no-member
             try:
                 linkHeader = parse_link_header(targetRequest.headers['link'])
@@ -179,7 +179,7 @@ def discoverEndpoint(url, test_urls=True, headers=None, timeout=None, request=No
                 if endpointURL:
                     debugOutput.append('found in body')
             if endpointURL is not None:
                 endpointURL = urljoin(url, endpointURL)
-                endpointURL = urljoin(url, endpointURL)
+                endpointURL = urljoin(sourceURL, endpointURL)
     except (requests.exceptions.RequestException, requests.exceptions.ConnectionError,
             requests.exceptions.HTTPError, requests.exceptions.URLRequired,
            requests.exceptions.TooManyRedirects, requests.exceptions.Timeout) as error:
````
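The only behavioural change in this file is renaming the first parameter of `discoverEndpoint` from `url` to `sourceURL`. A small sketch of what that means for callers follows; the URL is a placeholder and the return value is left unexamined, since its shape is not part of this diff.

```python
# Sketch of the effect of the url -> sourceURL rename on callers of
# discoverEndpoint(); the URL below is a placeholder.
from ronkyuu import webmention

# Positional callers are unaffected by the rename:
result = webmention.discoverEndpoint('https://example.com/some-post')

# Callers that passed the old name as a keyword would now get a TypeError
# ("unexpected keyword argument 'url'") and must switch to the new name:
result = webmention.discoverEndpoint(sourceURL='https://example.com/some-post')
```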
