forked from github.com/blag
Compare commits
83 Commits
| Author | SHA1 | Date |
|---|---|---|
|  | 73672598e5 |  |
|  | fc4eb0f463 |  |
|  | a224840b28 |  |
|  | 0399961aa3 |  |
|  | e2568d715e |  |
|  | 01507e9de6 |  |
|  | 87d619cc1c |  |
|  | 7b6b219cdf |  |
|  | 875fd85d65 |  |
|  | f59a648779 |  |
|  | ebac0a8fc4 |  |
|  | 2adc7b3bd4 |  |
|  | 10f84ebb16 |  |
|  | 451fb1b260 |  |
|  | 322154041a |  |
|  | f8cd915ac2 |  |
|  | ad0ab1a0fe |  |
|  | 641f0ed94e |  |
|  | 6e94a0c094 |  |
|  | 7587fa2cad |  |
|  | 91cd948c4b |  |
|  | bb3101ad77 |  |
|  | 9397e4c287 |  |
|  | d1de9692ea |  |
|  | c5ad4757e7 |  |
|  | 9e07e2e100 |  |
|  | 10bac5531f |  |
|  | 6d82d2ab79 |  |
|  | 595356e915 |  |
|  | 138a78357a |  |
|  | 6005369108 |  |
|  | db4e03afde |  |
|  | 877c47c391 |  |
|  | 3bd7125873 |  |
|  | 35f6ef05b6 |  |
|  | 7e8f2a5b9a |  |
|  | d942bf150c |  |
|  | 48cfb49acb |  |
|  | c3edbeb511 |  |
|  | a60887e0d6 |  |
|  | bfbedcc3df |  |
|  | dff60d7399 |  |
|  | f9b6afa80a |  |
|  | 1e74596101 |  |
|  | c469b9b591 |  |
|  | d76a0834e3 |  |
|  | 20b1e281a1 |  |
|  | 58e74f8d55 |  |
|  | aad5f288af |  |
|  | dc76295203 |  |
|  | 8a9a8cd2eb |  |
|  | 55e82393b6 |  |
|  | b74cea8296 |  |
|  | 58a164899c |  |
|  | 769dcca83a |  |
|  | f8bcaafc30 |  |
|  | f1fe211ac6 |  |
|  | c88628350f |  |
|  | 60e8b98232 |  |
|  | b74c34839f |  |
|  | 2a8f93147f |  |
|  | 4c12ef738c |  |
|  | bc71f51443 |  |
|  | 4cfbdd5108 |  |
|  | 6a07b19eda |  |
|  | d486b7a90b |  |
|  | 2355799aaa |  |
|  | 788c07446d |  |
|  | c1375a1478 |  |
|  | 67d9a31256 |  |
|  | 25c6a4c089 |  |
|  | e75fd4eacb |  |
|  | 88905db579 |  |
|  | 702bc2e986 |  |
|  | 7cc4d8be45 |  |
|  | 764317aa24 |  |
|  | a8a976403f |  |
|  | fd9a6e6fa2 |  |
|  | 8f02c107e2 |  |
|  | 489e546173 |  |
|  | 9143a4dc7f |  |
|  | 512c12eaae |  |
|  | ab3eaf934d |  |
.github/dependabot.yml  (vendored, 2 lines changed)

@@ -3,4 +3,4 @@ updates:
 - package-ecosystem: "pip"
   directory: "/"
   schedule:
-    interval: "daily"
+    interval: "weekly"
.github/workflows/python-package.yaml  (vendored, 8 lines changed)

@@ -17,8 +17,8 @@ jobs:
           - macos-latest
           - windows-latest
         python-version:
-          - 3.8
-          - 3.9
+          - "3.8"
+          - "3.9"
           - "3.10"

     steps:
@@ -36,3 +36,7 @@ jobs:
     - name: Run linter
       run: |
         make lint
+
+    - name: Run mypy
+      run: |
+        make mypy
.gitignore  (vendored, 4 lines changed)

@@ -5,7 +5,11 @@ build/
 dist/
 *.egg-info/

+docs/_build/
+docs/api/
+
 htmlcov/
 .coverage
+.mypy_cache

 venv/
CHANGELOG.md  (38 lines changed)

@@ -1,5 +1,43 @@
 # Changelog

+## [1.4.0] - 2022-09-01
+
+* added type hints and mypy --strict to test suite
+* improved default template
+* updated dependencies:
+    * markdown 3.4.1
+    * pygments 2.13.0
+    * flake8 5.0.4
+    * twine 4.0.1
+    * sphinx 5.1.1
+
+## [1.3.2] - 2022-06-29
+
+* Added --version option
+* added --verbose option, that increases the loglevel to 'debug'
+* Improved quickstart:
+    * respective default answers will be written to config if user provided no
+      answer
+    * added tests for quickstart
+* Added some test cases for the MarkdownLinkTreeprocessor
+
+## [1.3.1] - 2022-06-10
+
+* fixed man page
+
+## [1.3.0] - 2022-06-09
+
+* debianized package
+* Small fix in makefile
+* updated dependencies:
+    * pytest 7.1.2
+    * sphinx 5.0.0
+    * twine 3.7.1
+    * wheel 0.37.1
+    * markdown 3.3.7
+    * jinja 3.1.2
+    * pygments 2.12.0
+
 ## [1.2.0] - 2021-11-06

 * `make serve` now rebuilds immediately once after called to avoid serving
Makefile  (9 lines changed)

@@ -13,7 +13,8 @@ ifeq ($(OS), Windows_NT)
 endif


-all: lint test
+.PHONY: all
+all: lint mypy test

 $(VENV): requirements.txt requirements-dev.txt setup.py
 	$(PY) -m venv $(VENV)
@@ -26,6 +27,10 @@ $(VENV): requirements.txt requirements-dev.txt setup.py
 test: $(VENV)
 	$(BIN)/pytest

+.PHONY: mypy
+mypy: $(VENV)
+	$(BIN)/mypy
+
 .PHONY: lint
 lint: $(VENV)
 	$(BIN)/flake8
@@ -45,7 +50,9 @@ clean:
 	rm -rf build dist *.egg-info
 	rm -rf $(VENV)
 	rm -rf $(DOCS_OUT)
+	rm -rf $(DOCS_SRC)/api
 	find . -type f -name *.pyc -delete
 	find . -type d -name __pycache__ -delete
 	# coverage
 	rm -rf htmlcov .coverage
+	rm -rf .mypy_cache
README.md  (17 lines changed)

@@ -3,7 +3,7 @@
 blag is a blog-aware, static site generator, written in [Python][].

 * an example "deployment" can be found [here][venthur.de]
-* online [documentation][] is available on readthedocs
+* online [documentation][] is available on https://readthedocs.org.

 blag is named after [the blag of the webcomic xkcd][blagxkcd].

@@ -30,6 +30,21 @@ blag runs on Linux, Mac and Windows and requires Python >= 3.8
 [pypi]: https://pypi.org/project/blag/


+## Install
+
+blag is available on [PyPI][], you can install it via:
+
+```bash
+$ pip install blag
+```
+
+On Debian or Ubuntu, you can also just install the Debian package:
+
+```bash
+$ sudo aptitude install blag
+```
+
+
 ## Quickstart

 ```bash
@@ -1 +1 @@
-from blag.version import __VERSION__  # noqa
+from blag.version import __VERSION__ as __VERSION__  # noqa
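The explicit `as __VERSION__` re-export keeps the attribute public under mypy --strict (which flags implicit re-exports), while the version string stays reachable from the package itself. A minimal usage sketch, assuming blag is installed in the current environment; it mirrors what the test suite checks:

```python
# Minimal sketch: the re-exported version string is a public attribute
# of the blag package (test_version.py asserts exactly this).
import blag

assert isinstance(blag.__VERSION__, str)
print(blag.__VERSION__)  # e.g. "1.4.0" for this release
```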
276
blag/blag.py
276
blag/blag.py
@@ -4,6 +4,9 @@
|
||||
|
||||
"""
|
||||
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
from typing import Any
|
||||
import argparse
|
||||
import os
|
||||
import shutil
|
||||
@@ -11,20 +14,28 @@ import logging
|
||||
import configparser
|
||||
import sys
|
||||
|
||||
from jinja2 import Environment, ChoiceLoader, FileSystemLoader, PackageLoader
|
||||
from jinja2 import (
|
||||
Environment,
|
||||
ChoiceLoader,
|
||||
FileSystemLoader,
|
||||
PackageLoader,
|
||||
Template,
|
||||
)
|
||||
import feedgenerator
|
||||
|
||||
from blag.markdown import markdown_factory, convert_markdown
|
||||
from blag.devserver import serve
|
||||
from blag.version import __VERSION__
|
||||
from blag.quickstart import quickstart
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
logging.basicConfig(
|
||||
level=logging.INFO,
|
||||
format='%(asctime)s %(levelname)s %(name)s %(message)s',
|
||||
level=logging.INFO,
|
||||
format='%(asctime)s %(levelname)s %(name)s %(message)s',
|
||||
)
|
||||
|
||||
|
||||
def main(args=None):
|
||||
def main(arguments: list[str] | None = None) -> None:
|
||||
"""Main entrypoint for the CLI.
|
||||
|
||||
This method parses the CLI arguments and executes the respective
|
||||
@@ -32,20 +43,24 @@ def main(args=None):
|
||||
|
||||
Parameters
|
||||
----------
|
||||
args : list[str]
|
||||
arguments
|
||||
optional parameters, used for testing
|
||||
|
||||
"""
|
||||
args = parse_args(args)
|
||||
args = parse_args(arguments)
|
||||
# set loglevel
|
||||
if args.verbose:
|
||||
logger.setLevel(logging.DEBUG)
|
||||
logger.debug(f"This is blag {__VERSION__}.")
|
||||
args.func(args)
|
||||
|
||||
|
||||
def parse_args(args=None):
|
||||
def parse_args(args: list[str] | None = None) -> argparse.Namespace:
|
||||
"""Parse command line arguments.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
args : List[str]
|
||||
args
|
||||
optional parameters, used for testing
|
||||
|
||||
Returns
|
||||
@@ -54,83 +69,102 @@ def parse_args(args=None):
|
||||
|
||||
"""
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument(
|
||||
'--version',
|
||||
action='version',
|
||||
version='%(prog)s ' + __VERSION__,
|
||||
)
|
||||
parser.add_argument(
|
||||
'-v',
|
||||
'--verbose',
|
||||
action='store_true',
|
||||
help='Verbose output.',
|
||||
)
|
||||
|
||||
commands = parser.add_subparsers(dest='command')
|
||||
commands.required = True
|
||||
|
||||
build_parser = commands.add_parser(
|
||||
'build',
|
||||
help='Build website.',
|
||||
'build',
|
||||
help='Build website.',
|
||||
)
|
||||
build_parser.set_defaults(func=build)
|
||||
build_parser.add_argument(
|
||||
'-i', '--input-dir',
|
||||
default='content',
|
||||
help='Input directory (default: content)',
|
||||
'-i',
|
||||
'--input-dir',
|
||||
default='content',
|
||||
help='Input directory (default: content)',
|
||||
)
|
||||
build_parser.add_argument(
|
||||
'-o', '--output-dir',
|
||||
default='build',
|
||||
help='Ouptut directory (default: build)',
|
||||
'-o',
|
||||
'--output-dir',
|
||||
default='build',
|
||||
help='Ouptut directory (default: build)',
|
||||
)
|
||||
build_parser.add_argument(
|
||||
'-t', '--template-dir',
|
||||
default='templates',
|
||||
help='Template directory (default: templates)',
|
||||
'-t',
|
||||
'--template-dir',
|
||||
default='templates',
|
||||
help='Template directory (default: templates)',
|
||||
)
|
||||
build_parser.add_argument(
|
||||
'-s', '--static-dir',
|
||||
default='static',
|
||||
help='Static directory (default: static)',
|
||||
'-s',
|
||||
'--static-dir',
|
||||
default='static',
|
||||
help='Static directory (default: static)',
|
||||
)
|
||||
|
||||
quickstart_parser = commands.add_parser(
|
||||
'quickstart',
|
||||
help="Quickstart blag, creating necessary configuration.",
|
||||
'quickstart',
|
||||
help="Quickstart blag, creating necessary configuration.",
|
||||
)
|
||||
quickstart_parser.set_defaults(func=quickstart)
|
||||
|
||||
serve_parser = commands.add_parser(
|
||||
'serve',
|
||||
help="Start development server.",
|
||||
'serve',
|
||||
help="Start development server.",
|
||||
)
|
||||
serve_parser.set_defaults(func=serve)
|
||||
serve_parser.add_argument(
|
||||
'-i', '--input-dir',
|
||||
default='content',
|
||||
help='Input directory (default: content)',
|
||||
'-i',
|
||||
'--input-dir',
|
||||
default='content',
|
||||
help='Input directory (default: content)',
|
||||
)
|
||||
serve_parser.add_argument(
|
||||
'-o', '--output-dir',
|
||||
default='build',
|
||||
help='Ouptut directory (default: build)',
|
||||
'-o',
|
||||
'--output-dir',
|
||||
default='build',
|
||||
help='Ouptut directory (default: build)',
|
||||
)
|
||||
serve_parser.add_argument(
|
||||
'-t', '--template-dir',
|
||||
default='templates',
|
||||
help='Template directory (default: templates)',
|
||||
'-t',
|
||||
'--template-dir',
|
||||
default='templates',
|
||||
help='Template directory (default: templates)',
|
||||
)
|
||||
serve_parser.add_argument(
|
||||
'-s', '--static-dir',
|
||||
default='static',
|
||||
help='Static directory (default: static)',
|
||||
'-s',
|
||||
'--static-dir',
|
||||
default='static',
|
||||
help='Static directory (default: static)',
|
||||
)
|
||||
|
||||
return parser.parse_args(args)
|
||||
|
||||
|
||||
def get_config(configfile):
|
||||
def get_config(configfile: str) -> configparser.SectionProxy:
|
||||
"""Load site configuration from configfile.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
configfile : str
|
||||
configfile
|
||||
path to configuration file
|
||||
|
||||
|
||||
Returns
|
||||
-------
|
||||
dict
|
||||
configparser.SectionProxy
|
||||
|
||||
"""
|
||||
config = configparser.ConfigParser()
|
||||
@@ -150,7 +184,10 @@ def get_config(configfile):
|
||||
return config['main']
|
||||
|
||||
|
||||
def environment_factory(template_dir=None, globals_=None):
|
||||
def environment_factory(
|
||||
template_dir: str | None = None,
|
||||
globals_: dict[str, object] | None = None,
|
||||
) -> Environment:
|
||||
"""Environment factory.
|
||||
|
||||
Creates a Jinja2 Environment with the default templates and
|
||||
@@ -160,8 +197,9 @@ def environment_factory(template_dir=None, globals_=None):
|
||||
|
||||
Parameters
|
||||
----------
|
||||
template_dir : str
|
||||
globals_ : dict
|
||||
template_dir
|
||||
directory containing the templates
|
||||
globals_
|
||||
|
||||
Returns
|
||||
-------
|
||||
@@ -170,7 +208,7 @@ def environment_factory(template_dir=None, globals_=None):
|
||||
"""
|
||||
# first we try the custom templates, and fall back the ones provided
|
||||
# by blag
|
||||
loaders = []
|
||||
loaders: list[FileSystemLoader | PackageLoader] = []
|
||||
if template_dir:
|
||||
loaders.append(FileSystemLoader([template_dir]))
|
||||
loaders.append(PackageLoader('blag', 'templates'))
|
||||
@@ -180,7 +218,7 @@ def environment_factory(template_dir=None, globals_=None):
|
||||
return env
|
||||
|
||||
|
||||
def build(args):
|
||||
def build(args: argparse.Namespace) -> None:
|
||||
"""Build the site.
|
||||
|
||||
This is blag's main method that builds the site, generates the feed
|
||||
@@ -188,15 +226,16 @@ def build(args):
|
||||
|
||||
Parameters
|
||||
----------
|
||||
args : argparse.Namespace
|
||||
args
|
||||
|
||||
"""
|
||||
os.makedirs(f'{args.output_dir}', exist_ok=True)
|
||||
convertibles = []
|
||||
for root, dirnames, filenames in os.walk(args.input_dir):
|
||||
for filename in filenames:
|
||||
rel_src = os.path.relpath(f'{root}/{filename}',
|
||||
start=args.input_dir)
|
||||
rel_src = os.path.relpath(
|
||||
f'{root}/{filename}', start=args.input_dir
|
||||
)
|
||||
# all non-markdown files are just copied over, the markdown
|
||||
# files are converted to html
|
||||
if rel_src.endswith('.md'):
|
||||
@@ -204,14 +243,17 @@ def build(args):
|
||||
rel_dst = rel_dst[:-3] + '.html'
|
||||
convertibles.append((rel_src, rel_dst))
|
||||
else:
|
||||
shutil.copy(f'{args.input_dir}/{rel_src}',
|
||||
f'{args.output_dir}/{rel_src}')
|
||||
shutil.copy(
|
||||
f'{args.input_dir}/{rel_src}',
|
||||
f'{args.output_dir}/{rel_src}',
|
||||
)
|
||||
for dirname in dirnames:
|
||||
# all directories are copied into the output directory
|
||||
path = os.path.relpath(f'{root}/{dirname}', start=args.input_dir)
|
||||
os.makedirs(f'{args.output_dir}/{path}', exist_ok=True)
|
||||
|
||||
# copy static files over
|
||||
logger.info('Copying static files.')
|
||||
if os.path.exists(args.static_dir):
|
||||
shutil.copytree(args.static_dir, args.output_dir, dirs_exist_ok=True)
|
||||
|
||||
@@ -234,7 +276,8 @@ def build(args):
|
||||
)
|
||||
|
||||
generate_feed(
|
||||
articles, args.output_dir,
|
||||
articles,
|
||||
args.output_dir,
|
||||
base_url=config['base_url'],
|
||||
blog_title=config['title'],
|
||||
blog_description=config['description'],
|
||||
@@ -244,8 +287,13 @@ def build(args):
|
||||
generate_tags(articles, tags_template, tag_template, args.output_dir)
|
||||
|
||||
|
||||
def process_markdown(convertibles, input_dir, output_dir,
|
||||
page_template, article_template):
|
||||
def process_markdown(
|
||||
convertibles: list[tuple[str, str]],
|
||||
input_dir: str,
|
||||
output_dir: str,
|
||||
page_template: Template,
|
||||
article_template: Template,
|
||||
) -> tuple[list[tuple[str, dict[str, Any]]], list[tuple[str, dict[str, Any]]]]:
|
||||
"""Process markdown files.
|
||||
|
||||
This method processes the convertibles, converts them to html and
|
||||
@@ -256,16 +304,17 @@ def process_markdown(convertibles, input_dir, output_dir,
|
||||
|
||||
Parameters
|
||||
----------
|
||||
convertibles : List[Tuple[str, str]]
|
||||
convertibles
|
||||
relative paths to markdown- (src) html- (dest) files
|
||||
input_dir : str
|
||||
output_dir : str
|
||||
page_template, archive_template : jinja2 template
|
||||
input_dir
|
||||
output_dir
|
||||
page_template, archive_template
|
||||
templats for pages and articles
|
||||
|
||||
Returns
|
||||
-------
|
||||
articles, pages : List[Tuple[str, Dict]]
|
||||
list[tuple[str, dict[str, Any]]], list[tuple[str, dict[str, Any]]]
|
||||
articles and pages
|
||||
|
||||
"""
|
||||
logger.info("Converting Markdown files...")
|
||||
@@ -274,7 +323,8 @@ def process_markdown(convertibles, input_dir, output_dir,
|
||||
articles = []
|
||||
pages = []
|
||||
for src, dst in convertibles:
|
||||
logger.info(f'Processing {src}')
|
||||
logger.debug(f'Processing {src}')
|
||||
|
||||
with open(f'{input_dir}/{src}', 'r') as fh:
|
||||
body = fh.read()
|
||||
|
||||
@@ -300,37 +350,37 @@ def process_markdown(convertibles, input_dir, output_dir,
|
||||
|
||||
|
||||
def generate_feed(
|
||||
articles,
|
||||
output_dir,
|
||||
base_url,
|
||||
blog_title,
|
||||
blog_description,
|
||||
blog_author,
|
||||
):
|
||||
articles: list[tuple[str, dict[str, Any]]],
|
||||
output_dir: str,
|
||||
base_url: str,
|
||||
blog_title: str,
|
||||
blog_description: str,
|
||||
blog_author: str,
|
||||
) -> None:
|
||||
"""Generate Atom feed.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
articles : list[list[str, dict]]
|
||||
articles
|
||||
list of relative output path and article dictionary
|
||||
output_dir : str
|
||||
output_dir
|
||||
where the feed is stored
|
||||
base_url : str
|
||||
base_url
|
||||
base url
|
||||
blog_title : str
|
||||
blog_title
|
||||
blog title
|
||||
blog_description : str
|
||||
blog_description
|
||||
blog description
|
||||
blog_author : str
|
||||
blog_author
|
||||
blog author
|
||||
|
||||
"""
|
||||
logger.info('Generating Atom feed.')
|
||||
feed = feedgenerator.Atom1Feed(
|
||||
link=base_url,
|
||||
title=blog_title,
|
||||
description=blog_description,
|
||||
feed_url=base_url + 'atom.xml',
|
||||
link=base_url,
|
||||
title=blog_title,
|
||||
description=blog_description,
|
||||
feed_url=base_url + 'atom.xml',
|
||||
)
|
||||
|
||||
for dst, context in articles:
|
||||
@@ -351,16 +401,20 @@ def generate_feed(
|
||||
feed.write(fh, encoding='utf8')
|
||||
|
||||
|
||||
def generate_archive(articles, template, output_dir):
|
||||
def generate_archive(
|
||||
articles: list[tuple[str, dict[str, Any]]],
|
||||
template: Template,
|
||||
output_dir: str,
|
||||
) -> None:
|
||||
"""Generate the archive page.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
articles : list[list[str, dict]]
|
||||
articles
|
||||
List of articles. Each article has the destination path and a
|
||||
dictionary with the content.
|
||||
template : jinja2.Template instance
|
||||
output_dir : str
|
||||
template
|
||||
output_dir
|
||||
|
||||
"""
|
||||
archive = []
|
||||
@@ -374,78 +428,56 @@ def generate_archive(articles, template, output_dir):
|
||||
fh.write(result)
|
||||
|
||||
|
||||
def generate_tags(articles, tags_template, tag_template, output_dir):
|
||||
def generate_tags(
|
||||
articles: list[tuple[str, dict[str, Any]]],
|
||||
tags_template: Template,
|
||||
tag_template: Template,
|
||||
output_dir: str,
|
||||
) -> None:
|
||||
"""Generate the tags page.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
articles : list[list[str, dict]]
|
||||
articles
|
||||
List of articles. Each article has the destination path and a
|
||||
dictionary with the content.
|
||||
tags_template, tag_template : jinja2.Template instance
|
||||
output_dir : str
|
||||
tags_template, tag_template
|
||||
output_dir
|
||||
|
||||
"""
|
||||
logger.info("Generating Tag-pages.")
|
||||
os.makedirs(f'{output_dir}/tags', exist_ok=True)
|
||||
|
||||
# get tags number of occurrences
|
||||
all_tags = {}
|
||||
all_tags: dict[str, int] = {}
|
||||
for _, context in articles:
|
||||
tags = context.get('tags', [])
|
||||
tags: list[str] = context.get('tags', [])
|
||||
for tag in tags:
|
||||
all_tags[tag] = all_tags.get(tag, 0) + 1
|
||||
# sort by occurrence
|
||||
all_tags = sorted(all_tags.items(), key=lambda x: x[1], reverse=True)
|
||||
taglist: list[tuple[str, int]] = sorted(
|
||||
all_tags.items(), key=lambda x: x[1], reverse=True
|
||||
)
|
||||
|
||||
result = tags_template.render(dict(tags=all_tags))
|
||||
result = tags_template.render(dict(tags=taglist))
|
||||
with open(f'{output_dir}/tags/index.html', 'w') as fh:
|
||||
fh.write(result)
|
||||
|
||||
# get tags and archive per tag
|
||||
all_tags = {}
|
||||
all_tags2: dict[str, list[dict[str, Any]]] = {}
|
||||
for dst, context in articles:
|
||||
tags = context.get('tags', [])
|
||||
for tag in tags:
|
||||
archive = all_tags.get(tag, [])
|
||||
archive: list[dict[str, Any]] = all_tags2.get(tag, [])
|
||||
entry = context.copy()
|
||||
entry['dst'] = dst
|
||||
archive.append(entry)
|
||||
all_tags[tag] = archive
|
||||
all_tags2[tag] = archive
|
||||
|
||||
for tag, archive in all_tags.items():
|
||||
for tag, archive in all_tags2.items():
|
||||
result = tag_template.render(dict(archive=archive, tag=tag))
|
||||
with open(f'{output_dir}/tags/{tag}.html', 'w') as fh:
|
||||
fh.write(result)
|
||||
|
||||
|
||||
def quickstart(args):
|
||||
"""Quickstart.
|
||||
|
||||
This method asks the user some questions and generates a
|
||||
configuration file that is needed in order to run blag.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
args : argparse.Namespace
|
||||
|
||||
"""
|
||||
base_url = input("Hostname (and path) to the root? "
|
||||
"[https://example.com/]: ")
|
||||
title = input("Title of your website? ")
|
||||
description = input("Description of your website [John Does's Blog]? ")
|
||||
author = input("Author of your website [John Doe]? ")
|
||||
|
||||
config = configparser.ConfigParser()
|
||||
config['main'] = {
|
||||
'base_url': base_url,
|
||||
'title': title,
|
||||
'description': description,
|
||||
'author': author,
|
||||
}
|
||||
with open('config.ini', 'w') as fh:
|
||||
config.write(fh)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
|
||||
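The build pipeline above can also be driven without the console script by passing an argument list to `main()`, which is how the test suite exercises it. A minimal sketch, assuming the working directory already contains a `config.ini` and a `content/` directory (for example one produced by `blag quickstart`):

```python
# Minimal sketch: invoke blag's CLI entry point programmatically.
# Assumes config.ini, content/, templates/ and static/ exist in the
# current working directory, as blag's build command expects.
from blag import blag

# equivalent to running `blag build` on the command line
blag.main(["build"])

# equivalent to `blag --verbose build`, which raises the log level to DEBUG
blag.main(["--verbose", "build"])
```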
@@ -6,12 +6,16 @@ site if necessary.
|
||||
|
||||
"""
|
||||
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
from typing import NoReturn
|
||||
import os
|
||||
import logging
|
||||
import time
|
||||
import multiprocessing
|
||||
from http.server import SimpleHTTPRequestHandler, HTTPServer
|
||||
from functools import partial
|
||||
import argparse
|
||||
|
||||
from blag import blag
|
||||
|
||||
@@ -19,7 +23,7 @@ from blag import blag
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def get_last_modified(dirs):
|
||||
def get_last_modified(dirs: list[str]) -> float:
|
||||
"""Get the last modified time.
|
||||
|
||||
This method recursively goes through `dirs` and returns the most
|
||||
@@ -27,16 +31,16 @@ def get_last_modified(dirs):
|
||||
|
||||
Parameters
|
||||
----------
|
||||
dirs : list[str]
|
||||
dirs
|
||||
list of directories to search
|
||||
|
||||
Returns
|
||||
-------
|
||||
int
|
||||
float
|
||||
most recent modification time found in `dirs`
|
||||
|
||||
"""
|
||||
last_mtime = 0
|
||||
last_mtime = 0.0
|
||||
|
||||
for dir in dirs:
|
||||
for root, dirs, files in os.walk(dir):
|
||||
@@ -48,7 +52,7 @@ def get_last_modified(dirs):
|
||||
return last_mtime
|
||||
|
||||
|
||||
def autoreload(args):
|
||||
def autoreload(args: argparse.Namespace) -> NoReturn:
|
||||
"""Start the autoreloader.
|
||||
|
||||
This method monitors the given directories for changes (i.e. the
|
||||
@@ -60,14 +64,15 @@ def autoreload(args):
|
||||
|
||||
Parameters
|
||||
----------
|
||||
args : argparse.Namespace
|
||||
args
|
||||
contains the input-, template- and static dir
|
||||
|
||||
"""
|
||||
dirs = [args.input_dir, args.template_dir, args.static_dir]
|
||||
logger.info(f'Monitoring {dirs} for changes...')
|
||||
# make sure we trigger the rebuild immediately when we enter the
|
||||
# loop to avoid serving stale contents
|
||||
last_mtime = 0
|
||||
last_mtime = 0.0
|
||||
while True:
|
||||
mtime = get_last_modified(dirs)
|
||||
if mtime > last_mtime:
|
||||
@@ -77,16 +82,19 @@ def autoreload(args):
|
||||
time.sleep(1)
|
||||
|
||||
|
||||
def serve(args):
|
||||
def serve(args: argparse.Namespace) -> None:
|
||||
"""Start the webserver and the autoreloader.
|
||||
|
||||
Parameters
|
||||
----------
|
||||
args : arparse.Namespace
|
||||
args
|
||||
contains the input-, template- and static dir
|
||||
|
||||
"""
|
||||
httpd = HTTPServer(('', 8000), partial(SimpleHTTPRequestHandler,
|
||||
directory=args.output_dir))
|
||||
httpd = HTTPServer(
|
||||
('', 8000),
|
||||
partial(SimpleHTTPRequestHandler, directory=args.output_dir),
|
||||
)
|
||||
proc = multiprocessing.Process(target=autoreload, args=(args,))
|
||||
proc.start()
|
||||
logger.info("\n\n Devserver Started -- visit http://localhost:8000\n")
|
||||
|
||||
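The devserver change above makes `get_last_modified()` return a float; the autoreload loop simply rebuilds whenever the newest modification time under the watched directories grows. A minimal sketch of that check, assuming a local `content/` directory:

```python
# Minimal sketch: the autoreloader's change detection boils down to
# comparing the most recent mtime found under the watched directories.
from blag import devserver

before = devserver.get_last_modified(["content"])  # float mtime, 0.0 if nothing found

# ... edit or touch a file under content/ ...

after = devserver.get_last_modified(["content"])
if after > before:
    print("content changed, a rebuild would be triggered")
```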
@@ -5,9 +5,12 @@ processing.
|
||||
|
||||
"""
|
||||
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
from datetime import datetime
|
||||
import logging
|
||||
from urllib.parse import urlsplit, urlunsplit
|
||||
from xml.etree.ElementTree import Element
|
||||
|
||||
from markdown import Markdown
|
||||
from markdown.extensions import Extension
|
||||
@@ -17,7 +20,7 @@ from markdown.treeprocessors import Treeprocessor
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def markdown_factory():
|
||||
def markdown_factory() -> Markdown:
|
||||
"""Create a Markdown instance.
|
||||
|
||||
This method exists only to ensure we use the same Markdown instance
|
||||
@@ -30,15 +33,21 @@ def markdown_factory():
|
||||
"""
|
||||
md = Markdown(
|
||||
extensions=[
|
||||
'meta', 'fenced_code', 'codehilite', 'smarty',
|
||||
MarkdownLinkExtension()
|
||||
'meta',
|
||||
'fenced_code',
|
||||
'codehilite',
|
||||
'smarty',
|
||||
MarkdownLinkExtension(),
|
||||
],
|
||||
output_format='html5',
|
||||
output_format='html',
|
||||
)
|
||||
return md
|
||||
|
||||
|
||||
def convert_markdown(md, markdown):
|
||||
def convert_markdown(
|
||||
md: Markdown,
|
||||
markdown: str,
|
||||
) -> tuple[str, dict[str, str]]:
|
||||
"""Convert markdown into html and extract meta data.
|
||||
|
||||
Some meta data is treated special:
|
||||
@@ -48,18 +57,20 @@ def convert_markdown(md, markdown):
|
||||
|
||||
Parameters
|
||||
----------
|
||||
md : markdown.Markdown instance
|
||||
markdown : str
|
||||
md
|
||||
the Markdown instance
|
||||
markdown
|
||||
the markdown text that should be converted
|
||||
|
||||
Returns
|
||||
-------
|
||||
str, dict :
|
||||
str, dict[str, str]
|
||||
html and metadata
|
||||
|
||||
"""
|
||||
md.reset()
|
||||
content = md.convert(markdown)
|
||||
meta = md.Meta
|
||||
meta = md.Meta # type: ignore
|
||||
|
||||
# markdowns metadata consists as list of strings -- one item per
|
||||
# line. let's convert into single strings.
|
||||
@@ -83,24 +94,26 @@ def convert_markdown(md, markdown):
|
||||
|
||||
|
||||
class MarkdownLinkTreeprocessor(Treeprocessor):
|
||||
"""Converts relative links to .md files to .html
|
||||
"""Converts relative links to .md files to .html"""
|
||||
|
||||
"""
|
||||
|
||||
def run(self, root):
|
||||
def run(self, root: Element) -> Element:
|
||||
for element in root.iter():
|
||||
if element.tag == 'a':
|
||||
url = element.get('href')
|
||||
# element.get could also return None, we haven't seen this so
|
||||
# far, so lets wait if we raise this
|
||||
assert url is not None
|
||||
url = str(url)
|
||||
converted = self.convert(url)
|
||||
element.set('href', converted)
|
||||
return root
|
||||
|
||||
def convert(self, url):
|
||||
def convert(self, url: str) -> str:
|
||||
scheme, netloc, path, query, fragment = urlsplit(url)
|
||||
logger.debug(
|
||||
f'{url}: {scheme=} {netloc=} {path=} {query=} {fragment=}'
|
||||
)
|
||||
if (scheme or netloc or not path):
|
||||
if scheme or netloc or not path:
|
||||
return url
|
||||
if path.endswith('.md'):
|
||||
path = path[:-3] + '.html'
|
||||
@@ -110,10 +123,11 @@ class MarkdownLinkTreeprocessor(Treeprocessor):
|
||||
|
||||
|
||||
class MarkdownLinkExtension(Extension):
|
||||
"""markdown.extension that converts relative .md- to .html-links.
|
||||
"""markdown.extension that converts relative .md- to .html-links."""
|
||||
|
||||
"""
|
||||
def extendMarkdown(self, md):
|
||||
def extendMarkdown(self, md: Markdown) -> None:
|
||||
md.treeprocessors.register(
|
||||
MarkdownLinkTreeprocessor(md), 'mdlink', 0,
|
||||
MarkdownLinkTreeprocessor(md),
|
||||
'mdlink',
|
||||
0,
|
||||
)
|
||||
|
||||
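The extension registered above rewrites relative `.md` hrefs to `.html` and leaves URLs with a scheme or netloc untouched, while `convert_markdown()` returns the rendered HTML together with the flattened metadata. A minimal sketch mirroring the test suite; the input strings are made up for illustration:

```python
# Minimal sketch of the markdown helpers changed above: relative .md
# links become .html, metadata comes back as a dict, absolute URLs are
# left alone.
from blag.markdown import markdown_factory, convert_markdown

md = markdown_factory()

html, meta = convert_markdown(md, "title: hello\ntags: foo, bar\n\n[about](about.md)")
assert "about.html" in html            # relative .md link converted
assert meta["title"] == "hello"
assert meta["tags"] == ["foo", "bar"]  # tags are split and lower-cased

html2, _ = convert_markdown(md, "[ext](https://example.com/x.md)")
assert "https://example.com/x.md" in html2  # links with a scheme are untouched
```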
blag/quickstart.py  (new file, 73 lines added)

@@ -0,0 +1,73 @@
+"""Helper methods for blag's quickstart command.
+
+"""
+
+# remove when we don't support py38 anymore
+from __future__ import annotations
+import configparser
+import argparse
+
+
+def get_input(question: str, default: str) -> str:
+    """Prompt for user input.
+
+    This is a wrapper around the input-builtin. It will show the default answer
+    in the prompt and -- if no answer was given -- use the default.
+
+    Parameters
+    ----------
+    question
+        the question the user is presented
+    default
+        the default value that will be used if no answer was given
+
+    Returns
+    -------
+    str
+        the answer
+
+    """
+    reply = input(f"{question} [{default}]: ")
+    if not reply:
+        reply = default
+    return reply
+
+
+def quickstart(args: argparse.Namespace | None) -> None:
+    """Quickstart.
+
+    This method asks the user some questions and generates a
+    configuration file that is needed in order to run blag.
+
+    Parameters
+    ----------
+    args
+        not used
+
+    """
+    base_url = get_input(
+        "Hostname (and path) to the root?",
+        "https://example.com/",
+    )
+    title = get_input(
+        "Title of your website?",
+        "My little blog",
+    )
+    description = get_input(
+        "Description of your website?",
+        "John Doe's Blog",
+    )
+    author = get_input(
+        "Author of your website",
+        "John Doe",
+    )
+
+    config = configparser.ConfigParser()
+    config['main'] = {
+        'base_url': base_url,
+        'title': title,
+        'description': description,
+        'author': author,
+    }
+    with open('config.ini', 'w') as fh:
+        config.write(fh)
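`get_input()` shows the default inside the prompt and falls back to it on an empty answer, and `quickstart()` writes all collected answers to `config.ini`. A small interactive sketch; the prompt text and defaults are the ones defined in the file above:

```python
# Minimal sketch of the quickstart helpers defined above.
from blag.quickstart import get_input, quickstart

title = get_input("Title of your website?", "My little blog")
# pressing enter at the prompt returns "My little blog"

quickstart(None)  # asks the remaining questions and writes config.ini
```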
@@ -7,6 +7,11 @@
|
||||
|
||||
{% if entry.title %}
|
||||
<h1><a href="{{entry.dst}}">{{entry.title}}</a></h1>
|
||||
|
||||
{% if entry.description %}
|
||||
<p>— {{ entry.description }}</p>
|
||||
{% endif %}
|
||||
|
||||
{% endif %}
|
||||
|
||||
<p>Written on {{ entry.date.date() }}.</p>
|
||||
|
||||
@@ -3,5 +3,26 @@
|
||||
{% block title %}{{ title }}{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
{{ content }}
|
||||
|
||||
{% if title %}
|
||||
<h2>{{ title }}</h2>
|
||||
{% endif %}
|
||||
|
||||
<aside>
|
||||
<p>published on {{ date.date() }}
|
||||
|
||||
{% if tags %}
|
||||
· tagged with
|
||||
{% for tag in tags|sort(case_sensitive=true) %}
|
||||
{%- if not loop.first and not loop.last %}, {% endif -%}
|
||||
{%- if loop.last and not loop.first %} and {% endif %}
|
||||
<a href="/tags/{{ tag }}.html">#{{ tag }}</a>
|
||||
{%- endfor %}
|
||||
{% endif %}
|
||||
</p>
|
||||
</aside>
|
||||
|
||||
|
||||
{{ content }}
|
||||
|
||||
{% endblock %}
|
||||
|
||||
@@ -7,16 +7,20 @@
|
||||
<meta name="author" content="{{ site.author }}">
|
||||
{%- if description %}
|
||||
<meta name="description" content="{{ description }}">
|
||||
{% endif %}
|
||||
<title>{% block title %}{% endblock %}</title>
|
||||
{%- else %}
|
||||
<meta name="description" content="{{ site.description }}">
|
||||
{%- endif %}
|
||||
<title>{% block title %}{% endblock %} | {{ site.description }}</title>
|
||||
</head>
|
||||
|
||||
<body>
|
||||
<header>
|
||||
<h1>A Blog</h1>
|
||||
<h1><a href="/">{{ site.title }}</a></h1>
|
||||
<nav>
|
||||
<h2>{{ site.description }}</h2>
|
||||
<ul>
|
||||
<li><a href="/">Blog</a></li>
|
||||
<li><a href="/tags/">Tags</a></li>
|
||||
<li><a href="/atom.xml">Atom Feed</a></li>
|
||||
</ul>
|
||||
</nav>
|
||||
@@ -26,6 +30,7 @@
|
||||
{% block content %}
|
||||
{% endblock %}
|
||||
</main>
|
||||
|
||||
</body>
|
||||
|
||||
</html>
|
||||
|
||||
@@ -7,6 +7,11 @@
|
||||
|
||||
{% if entry.title %}
|
||||
<h1><a href="/{{entry.dst}}">{{entry.title}}</a></h1>
|
||||
|
||||
{% if entry.description %}
|
||||
<p>— {{ entry.description }}</p>
|
||||
{% endif %}
|
||||
|
||||
{% endif %}
|
||||
|
||||
<p>Written on {{ entry.date.date() }}.</p>
|
||||
|
||||
@@ -1 +1 @@
-__VERSION__ = '1.2.0'
+__VERSION__ = '1.4.0'
debian/blag-doc.docs  (vendored, new file, 1 line added)

@@ -0,0 +1 @@
+build/html/
debian/blag.install  (vendored, new file, 1 line added)

@@ -0,0 +1 @@
+build/man/blag.1 /usr/share/man/man1
debian/changelog  (vendored, new file, 29 lines added)

@@ -0,0 +1,29 @@
+blag (1.4.0) unstable; urgency=medium
+
+  * added type hints and mypy --strict to test suite
+  * improved default template
+
+ -- Bastian Venthur <venthur@debian.org>  Thu, 01 Sep 2022 18:59:11 +0200
+
+blag (1.3.2) unstable; urgency=medium
+
+  * Added --version option
+  * Improved quickstart:
+    * respective default answers will be written to config if user provided no
+      answer
+    * added tests for quickstart
+  * Added some test cases for the MarkdownLinkTreeprocessor
+
+ -- Bastian Venthur <venthur@debian.org>  Wed, 29 Jun 2022 21:27:15 +0200
+
+blag (1.3.1) unstable; urgency=medium
+
+  * re-upload with man pages
+
+ -- Bastian Venthur <venthur@debian.org>  Fri, 10 Jun 2022 07:26:19 +0200
+
+blag (1.3.0) unstable; urgency=medium
+
+  * Initial release. Closes: #1012584
+
+ -- Bastian Venthur <venthur@debian.org>  Sun, 05 Jun 2022 15:20:48 +0200
debian/control  (vendored, new file, 59 lines added)

@@ -0,0 +1,59 @@
+Source: blag
+Section: python
+Priority: optional
+Maintainer: Bastian Venthur <venthur@debian.org>
+Rules-Requires-Root: no
+Build-Depends:
+ debhelper-compat (= 13),
+ dh-sequence-sphinxdoc,
+ dh-sequence-python3,
+ dh-python,
+ python3-setuptools,
+ python3-all,
+ python3-markdown,
+ python3-feedgenerator,
+ python3-jinja2,
+ python3-pygments,
+ python3-pytest,
+ python3-pytest-cov,
+ python3-sphinx,
+#Testsuite: autopkgtest-pkg-python
+Standards-Version: 4.6.0.1
+Homepage: https://github.com/venthur/blag
+Vcs-Browser: https://github.com/venthur/blag
+Vcs-Git: https://github.com/venthur/blag.git
+
+Package: blag
+Architecture: all
+Depends:
+ ${python3:Depends},
+ ${misc:Depends},
+Suggests:
+ python-blag-doc,
+Description: Blog-aware, static site generator
+ Blag is a blog-aware, static site generator, written in Python. It supports
+ the following features:
+  * Write content in Markdown
+  * Theming support using Jinja2 templates
+  * Generation of Atom feeds for blog content
+  * Fenced code blocks and syntax highlighting using Pygments
+  * Integrated devserver
+  * Available on PyPI
+
+Package: blag-doc
+Section: doc
+Architecture: all
+Depends:
+ ${sphinxdoc:Depends},
+ ${misc:Depends},
+Description: Blog-aware, static site generator (documentation)
+ Blag is a blog-aware, static site generator, written in Python. It supports
+ the following features:
+  * Write content in Markdown
+  * Theming support using Jinja2 templates
+  * Generation of Atom feeds for blog content
+  * Fenced code blocks and syntax highlighting using Pygments
+  * Integrated devserver
+  * Available on PyPI
+ .
+ This is the common documentation package.
debian/copyright  (vendored, new file, 35 lines added)

@@ -0,0 +1,35 @@
+Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
+Source: https://github.com/venthur/blag
+Upstream-Name: blag
+Upstream-Contact: Bastian Venthur <venthur@debian.org>
+
+Files:
+ *
+Copyright:
+ 2022 Bastian Venthur <venthur@debian.org>
+License: MIT
+
+Files:
+ debian/*
+Copyright:
+ 2022 Bastian Venthur <venthur@debian.org>
+License: MIT
+
+License: MIT
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+ .
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
debian/rules  (vendored, new executable file, 25 lines added)

@@ -0,0 +1,25 @@
+#!/usr/bin/make -f
+
+# See debhelper(7) (uncomment to enable).
+# Output every command that modifies files on the build system.
+#export DH_VERBOSE = 1
+
+export PYBUILD_DESTDIR=debian/blag
+export PYBUILD_TEST_ARGS=--no-cov
+export PYBUILD_NAME=blag
+
+%:
+	dh $@ --with python3,sphinxdoc --buildsystem=pybuild
+
+
+# If you need to rebuild the Sphinx documentation:
+# Add sphinxdoc to the dh --with line.
+#
+# And uncomment the following lines.
+execute_after_dh_auto_build-indep: export http_proxy=127.0.0.1:9
+execute_after_dh_auto_build-indep: export https_proxy=127.0.0.1:9
+execute_after_dh_auto_build-indep:
+	PYTHONPATH=. python3 -m sphinx -N -bhtml \
+		docs/ build/html # HTML generator
+	PYTHONPATH=. python3 -m sphinx -N -bman \
+		docs/ build/man # Manpage generator
debian/source/format  (vendored, new file, 1 line added)

@@ -0,0 +1 @@
+3.0 (native)
@@ -9,3 +9,4 @@ API
    blag.blag
    blag.markdown
    blag.devserver
+   blag.quickstart
@@ -1,6 +1,8 @@
-sphinx==4.2.0
-twine==3.5.0
-wheel==0.37.0
-pytest==6.2.5
+sphinx==5.1.1
+twine==4.0.1
+wheel==0.37.1
+pytest==7.1.2
 pytest-cov==3.0.0
-flake8==4.0.1
+flake8==5.0.4
+mypy==0.971
+types-markdown==3.4.1
@@ -1,4 +1,4 @@
-markdown==3.3.4
+markdown==3.4.1
 feedgenerator==2.0.0
-jinja2==3.0.2
-pygments==2.10.0
+jinja2==3.1.2
+pygments==2.13.0
@@ -7,3 +7,10 @@ addopts =

 [flake8]
 exclude = venv,build,docs
+
+[mypy]
+files = blag,tests
+strict = True
+
+[mypy-feedgenerator.*]
+ignore_missing_imports = True
@@ -1,13 +1,18 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
from argparse import Namespace
|
||||
from typing import Iterator, Callable
|
||||
from tempfile import TemporaryDirectory
|
||||
import os
|
||||
|
||||
import pytest
|
||||
from jinja2 import Environment, Template
|
||||
|
||||
from blag import blag
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def environment():
|
||||
def environment() -> Iterator[Environment]:
|
||||
site = {
|
||||
'base_url': 'site base_url',
|
||||
'title': 'site title',
|
||||
@@ -19,35 +24,33 @@ def environment():
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def page_template(environment):
|
||||
def page_template(environment: Environment) -> Iterator[Template]:
|
||||
yield environment.get_template('page.html')
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def article_template(environment):
|
||||
def article_template(environment: Environment) -> Iterator[Template]:
|
||||
yield environment.get_template('article.html')
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def archive_template(environment):
|
||||
def archive_template(environment: Environment) -> Iterator[Template]:
|
||||
yield environment.get_template('archive.html')
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def tags_template(environment):
|
||||
def tags_template(environment: Environment) -> Iterator[Template]:
|
||||
yield environment.get_template('tags.html')
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def tag_template(environment):
|
||||
def tag_template(environment: Environment) -> Iterator[Template]:
|
||||
yield environment.get_template('tag.html')
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def cleandir():
|
||||
"""Create a temporary workind directory and cwd.
|
||||
|
||||
"""
|
||||
def cleandir() -> Iterator[str]:
|
||||
"""Create a temporary workind directory and cwd."""
|
||||
config = """
|
||||
[main]
|
||||
base_url = https://example.com/
|
||||
@@ -70,17 +73,12 @@ author = a. u. thor
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def args(cleandir):
|
||||
def args(cleandir: Callable[[], Iterator[str]]) -> Iterator[Namespace]:
|
||||
|
||||
class NameSpace:
|
||||
def __init__(self, **kwargs):
|
||||
for name in kwargs:
|
||||
setattr(self, name, kwargs[name])
|
||||
|
||||
args = NameSpace(
|
||||
input_dir='content',
|
||||
output_dir='build',
|
||||
static_dir='static',
|
||||
template_dir='templates',
|
||||
args = Namespace(
|
||||
input_dir='content',
|
||||
output_dir='build',
|
||||
static_dir='static',
|
||||
template_dir='templates',
|
||||
)
|
||||
yield args
|
||||
|
||||
@@ -1,41 +1,53 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
from tempfile import TemporaryDirectory
|
||||
import os
|
||||
from datetime import datetime
|
||||
from typing import Any
|
||||
from argparse import Namespace
|
||||
|
||||
import pytest
|
||||
from pytest import CaptureFixture, LogCaptureFixture
|
||||
from jinja2 import Template
|
||||
|
||||
from blag import __VERSION__
|
||||
from blag import blag
|
||||
|
||||
|
||||
def test_generate_feed(cleandir):
|
||||
articles = []
|
||||
def test_generate_feed(cleandir: str) -> None:
|
||||
articles: list[tuple[str, dict[str, Any]]] = []
|
||||
blag.generate_feed(articles, 'build', ' ', ' ', ' ', ' ')
|
||||
assert os.path.exists('build/atom.xml')
|
||||
|
||||
|
||||
def test_feed(cleandir):
|
||||
articles = [
|
||||
[
|
||||
def test_feed(cleandir: str) -> None:
|
||||
articles: list[tuple[str, dict[str, Any]]] = [
|
||||
(
|
||||
'dest1.html',
|
||||
{
|
||||
'title': 'title1',
|
||||
'date': datetime(2019, 6, 6),
|
||||
'content': 'content1',
|
||||
}
|
||||
],
|
||||
[
|
||||
},
|
||||
),
|
||||
(
|
||||
'dest2.html',
|
||||
{
|
||||
'title': 'title2',
|
||||
'date': datetime(1980, 5, 9),
|
||||
'content': 'content2',
|
||||
}
|
||||
],
|
||||
|
||||
},
|
||||
),
|
||||
]
|
||||
|
||||
blag.generate_feed(articles, 'build', 'https://example.com/',
|
||||
'blog title', 'blog description', 'blog author')
|
||||
blag.generate_feed(
|
||||
articles,
|
||||
'build',
|
||||
'https://example.com/',
|
||||
'blog title',
|
||||
'blog description',
|
||||
'blog author',
|
||||
)
|
||||
with open('build/atom.xml') as fh:
|
||||
feed = fh.read()
|
||||
|
||||
@@ -60,18 +72,20 @@ def test_feed(cleandir):
|
||||
assert '<link href="https://example.com/dest2.html"' in feed
|
||||
|
||||
|
||||
def test_generate_feed_with_description(cleandir):
|
||||
def test_generate_feed_with_description(cleandir: str) -> None:
|
||||
# if a description is provided, it will be used as the summary in
|
||||
# the feed, otherwise we simply use the title of the article
|
||||
articles = [[
|
||||
'dest.html',
|
||||
{
|
||||
'title': 'title',
|
||||
'description': 'description',
|
||||
'date': datetime(2019, 6, 6),
|
||||
'content': 'content',
|
||||
}
|
||||
]]
|
||||
articles: list[tuple[str, dict[str, Any]]] = [
|
||||
(
|
||||
'dest.html',
|
||||
{
|
||||
'title': 'title',
|
||||
'description': 'description',
|
||||
'date': datetime(2019, 6, 6),
|
||||
'content': 'content',
|
||||
},
|
||||
)
|
||||
]
|
||||
blag.generate_feed(articles, 'build', ' ', ' ', ' ', ' ')
|
||||
|
||||
with open('build/atom.xml') as fh:
|
||||
@@ -83,7 +97,7 @@ def test_generate_feed_with_description(cleandir):
|
||||
assert '<content type="html">content' in feed
|
||||
|
||||
|
||||
def test_parse_args_build():
|
||||
def test_parse_args_build() -> None:
|
||||
# test default args
|
||||
args = blag.parse_args(['build'])
|
||||
assert args.input_dir == 'content'
|
||||
@@ -116,7 +130,7 @@ def test_parse_args_build():
|
||||
assert args.static_dir == 'foo'
|
||||
|
||||
|
||||
def test_get_config():
|
||||
def test_get_config() -> None:
|
||||
config = """
|
||||
[main]
|
||||
base_url = https://example.com/
|
||||
@@ -138,10 +152,9 @@ author = a. u. thor
|
||||
|
||||
# a missing required config causes a sys.exit
|
||||
for x in 'base_url', 'title', 'description', 'author':
|
||||
config2 = '\n'.join([line
|
||||
for line
|
||||
in config.splitlines()
|
||||
if not line.startswith(x)])
|
||||
config2 = '\n'.join(
|
||||
[line for line in config.splitlines() if not line.startswith(x)]
|
||||
)
|
||||
with TemporaryDirectory() as dir:
|
||||
configfile = f'{dir}/config.ini'
|
||||
with open(configfile, 'w') as fh:
|
||||
@@ -166,17 +179,18 @@ author = a. u. thor
|
||||
assert config_parsed['base_url'] == 'https://example.com/'
|
||||
|
||||
|
||||
def test_environment_factory():
|
||||
globals_ = {
|
||||
'foo': 'bar',
|
||||
'test': 'me'
|
||||
}
|
||||
def test_environment_factory() -> None:
|
||||
globals_: dict[str, object] = {'foo': 'bar', 'test': 'me'}
|
||||
env = blag.environment_factory(globals_=globals_)
|
||||
assert env.globals['foo'] == 'bar'
|
||||
assert env.globals['test'] == 'me'
|
||||
|
||||
|
||||
def test_process_markdown(cleandir, page_template, article_template):
|
||||
def test_process_markdown(
|
||||
cleandir: str,
|
||||
page_template: Template,
|
||||
article_template: Template,
|
||||
) -> None:
|
||||
page1 = """\
|
||||
title: some page
|
||||
|
||||
@@ -202,17 +216,12 @@ foo bar
|
||||
|
||||
convertibles = []
|
||||
for i, txt in enumerate((page1, article1, article2)):
|
||||
i = str(i)
|
||||
with open(f'content/{i}', 'w') as fh:
|
||||
with open(f'content/{str(i)}', 'w') as fh:
|
||||
fh.write(txt)
|
||||
convertibles.append([i, i])
|
||||
convertibles.append((str(i), str(i)))
|
||||
|
||||
articles, pages = blag.process_markdown(
|
||||
convertibles,
|
||||
'content',
|
||||
'build',
|
||||
page_template,
|
||||
article_template
|
||||
convertibles, 'content', 'build', page_template, article_template
|
||||
)
|
||||
|
||||
assert isinstance(articles, list)
|
||||
@@ -230,7 +239,7 @@ foo bar
|
||||
assert 'content' in context
|
||||
|
||||
|
||||
def test_build(args):
|
||||
def test_build(args: Namespace) -> None:
|
||||
page1 = """\
|
||||
title: some page
|
||||
|
||||
@@ -259,10 +268,9 @@ foo bar
|
||||
# write some convertibles
|
||||
convertibles = []
|
||||
for i, txt in enumerate((page1, article1, article2)):
|
||||
i = str(i)
|
||||
with open(f'{args.input_dir}/{i}.md', 'w') as fh:
|
||||
with open(f'{args.input_dir}/{str(i)}.md', 'w') as fh:
|
||||
fh.write(txt)
|
||||
convertibles.append([i, i])
|
||||
convertibles.append((str(i), str(i)))
|
||||
|
||||
# some static files
|
||||
with open(f'{args.static_dir}/test', 'w') as fh:
|
||||
@@ -274,6 +282,40 @@ foo bar
|
||||
|
||||
blag.build(args)
|
||||
|
||||
# test existence of the three converted files
|
||||
for i in range(3):
|
||||
assert os.path.exists(f'{args.output_dir}/{i}.html')
|
||||
# ... static file
|
||||
assert os.path.exists(f'{args.output_dir}/test')
|
||||
# ... directory
|
||||
assert os.path.exists(f'{args.output_dir}/testdir/test')
|
||||
# ... feed
|
||||
assert os.path.exists(f'{args.output_dir}/atom.xml')
|
||||
# ... archive
|
||||
assert os.path.exists(f'{args.output_dir}/index.html')
|
||||
# ... tags
|
||||
assert os.path.exists(f'{args.output_dir}/tags/index.html')
|
||||
assert os.path.exists(f'{args.output_dir}/tags/foo.html')
|
||||
assert os.path.exists(f'{args.output_dir}/tags/bar.html')
|
||||
|
||||
def test_main(cleandir):
|
||||
|
||||
def test_main(cleandir: str) -> None:
|
||||
blag.main(['build'])
|
||||
|
||||
|
||||
def test_cli_version(capsys: CaptureFixture[str]) -> None:
|
||||
with pytest.raises(SystemExit) as ex:
|
||||
blag.main(['--version'])
|
||||
# normal system exit
|
||||
assert ex.value.code == 0
|
||||
# proper version reported
|
||||
out, _ = capsys.readouterr()
|
||||
assert __VERSION__ in out
|
||||
|
||||
|
||||
def test_cli_verbose(cleandir: str, caplog: LogCaptureFixture) -> None:
|
||||
blag.main(['build'])
|
||||
assert 'DEBUG' not in caplog.text
|
||||
|
||||
blag.main(['--verbose', 'build'])
|
||||
assert 'DEBUG' in caplog.text
|
||||
|
||||
@@ -1,12 +1,15 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
import time
|
||||
import threading
|
||||
from argparse import Namespace
|
||||
|
||||
import pytest
|
||||
|
||||
from blag import devserver
|
||||
|
||||
|
||||
def test_get_last_modified(cleandir):
|
||||
def test_get_last_modified(cleandir: str) -> None:
|
||||
# take initial time
|
||||
t1 = devserver.get_last_modified(['content'])
|
||||
|
||||
@@ -24,14 +27,16 @@ def test_get_last_modified(cleandir):
|
||||
assert t2 == t3
|
||||
|
||||
|
||||
def test_autoreload_builds_immediately(args):
|
||||
def test_autoreload_builds_immediately(args: Namespace) -> None:
|
||||
# create a dummy file that can be build
|
||||
with open('content/test.md', 'w') as fh:
|
||||
fh.write('boo')
|
||||
|
||||
t = threading.Thread(target=devserver.autoreload,
|
||||
args=(args, ),
|
||||
daemon=True,)
|
||||
t = threading.Thread(
|
||||
target=devserver.autoreload,
|
||||
args=(args,),
|
||||
daemon=True,
|
||||
)
|
||||
t0 = devserver.get_last_modified(['build'])
|
||||
t.start()
|
||||
# try for 5 seconds...
|
||||
@@ -44,11 +49,15 @@ def test_autoreload_builds_immediately(args):
|
||||
assert t1 > t0
|
||||
|
||||
|
||||
@pytest.mark.filterwarnings("ignore::pytest.PytestUnhandledThreadExceptionWarning") # noqa
|
||||
def test_autoreload(args):
|
||||
t = threading.Thread(target=devserver.autoreload,
|
||||
args=(args, ),
|
||||
daemon=True,)
|
||||
@pytest.mark.filterwarnings(
|
||||
"ignore::pytest.PytestUnhandledThreadExceptionWarning"
|
||||
)
|
||||
def test_autoreload(args: Namespace) -> None:
|
||||
t = threading.Thread(
|
||||
target=devserver.autoreload,
|
||||
args=(args,),
|
||||
daemon=True,
|
||||
)
|
||||
t.start()
|
||||
|
||||
t0 = devserver.get_last_modified(['build'])
|
||||
|
||||
@@ -1,4 +1,7 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
from datetime import datetime
|
||||
from typing import Any
|
||||
|
||||
import pytest
|
||||
import markdown
|
||||
@@ -6,52 +9,77 @@ import markdown
|
||||
from blag.markdown import convert_markdown, markdown_factory
|
||||
|
||||
|
||||
@pytest.mark.parametrize("input_, expected", [
|
||||
# inline
|
||||
('[test](test.md)', 'test.html'),
|
||||
('[test](test.md "test")', 'test.html'),
|
||||
('[test](a/test.md)', 'a/test.html'),
|
||||
('[test](a/test.md "test")', 'a/test.html'),
|
||||
('[test](/test.md)', '/test.html'),
|
||||
('[test](/test.md "test")', '/test.html'),
|
||||
('[test](/a/test.md)', '/a/test.html'),
|
||||
('[test](/a/test.md "test")', '/a/test.html'),
|
||||
# reference
|
||||
('[test][]\n[test]: test.md ''', 'test.html'),
|
||||
('[test][]\n[test]: test.md "test"', 'test.html'),
|
||||
('[test][]\n[test]: a/test.md', 'a/test.html'),
|
||||
('[test][]\n[test]: a/test.md "test"', 'a/test.html'),
|
||||
('[test][]\n[test]: /test.md', '/test.html'),
|
||||
('[test][]\n[test]: /test.md "test"', '/test.html'),
|
||||
('[test][]\n[test]: /a/test.md', '/a/test.html'),
|
||||
('[test][]\n[test]: /a/test.md "test"', '/a/test.html'),
|
||||
])
|
||||
def test_convert_markdown_links(input_, expected):
|
||||
@pytest.mark.parametrize(
|
||||
"input_, expected",
|
||||
[
|
||||
# inline
|
||||
('[test](test.md)', 'test.html'),
|
||||
('[test](test.md "test")', 'test.html'),
|
||||
('[test](a/test.md)', 'a/test.html'),
|
||||
('[test](a/test.md "test")', 'a/test.html'),
|
||||
('[test](/test.md)', '/test.html'),
|
||||
('[test](/test.md "test")', '/test.html'),
|
||||
('[test](/a/test.md)', '/a/test.html'),
|
||||
('[test](/a/test.md "test")', '/a/test.html'),
|
||||
# reference
|
||||
('[test][]\n[test]: test.md ' '', 'test.html'),
|
||||
('[test][]\n[test]: test.md "test"', 'test.html'),
|
||||
('[test][]\n[test]: a/test.md', 'a/test.html'),
|
||||
('[test][]\n[test]: a/test.md "test"', 'a/test.html'),
|
||||
('[test][]\n[test]: /test.md', '/test.html'),
|
||||
('[test][]\n[test]: /test.md "test"', '/test.html'),
|
||||
('[test][]\n[test]: /a/test.md', '/a/test.html'),
|
||||
('[test][]\n[test]: /a/test.md "test"', '/a/test.html'),
|
||||
],
|
||||
)
|
||||
def test_convert_markdown_links(input_: str, expected: str) -> None:
|
||||
md = markdown_factory()
|
||||
html, _ = convert_markdown(md, input_)
|
||||
assert expected in html
|
||||
|
||||
|
||||
@pytest.mark.parametrize("input_, expected", [
|
||||
('foo: bar', {'foo': 'bar'}),
|
||||
('foo: those are several words', {'foo': 'those are several words'}),
|
||||
('tags: this, is, a, test\n', {'tags': ['this', 'is', 'a', 'test']}),
|
||||
('tags: this, IS, a, test', {'tags': ['this', 'is', 'a', 'test']}),
|
||||
('date: 2020-01-01 12:10', {'date':
|
||||
datetime(2020, 1, 1, 12, 10).astimezone()}),
|
||||
])
|
||||
def test_convert_metadata(input_, expected):
|
||||
@pytest.mark.parametrize(
|
||||
"input_, expected",
|
||||
[
|
||||
# scheme
|
||||
('[test](https://)', 'https://'),
|
||||
# netloc
|
||||
('[test](//test.md)', '//test.md'),
|
||||
# no path
|
||||
('[test]()', ''),
|
||||
],
|
||||
)
|
||||
def test_dont_convert_normal_links(input_: str, expected: str) -> None:
|
||||
md = markdown_factory()
|
||||
html, _ = convert_markdown(md, input_)
|
||||
assert expected in html
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"input_, expected",
|
||||
[
|
||||
('foo: bar', {'foo': 'bar'}),
|
||||
('foo: those are several words', {'foo': 'those are several words'}),
|
||||
('tags: this, is, a, test\n', {'tags': ['this', 'is', 'a', 'test']}),
|
||||
('tags: this, IS, a, test', {'tags': ['this', 'is', 'a', 'test']}),
|
||||
(
|
||||
'date: 2020-01-01 12:10',
|
||||
{'date': datetime(2020, 1, 1, 12, 10).astimezone()},
|
||||
),
|
||||
],
|
||||
)
|
||||
def test_convert_metadata(input_: str, expected: dict[str, Any]) -> None:
|
||||
md = markdown_factory()
|
||||
_, meta = convert_markdown(md, input_)
|
||||
assert expected == meta
|
||||
|
||||
|
||||
def test_markdown_factory():
|
||||
def test_markdown_factory() -> None:
|
||||
md = markdown_factory()
|
||||
assert isinstance(md, markdown.Markdown)
|
||||
|
||||
|
||||
def test_smarty():
|
||||
def test_smarty() -> None:
|
||||
md = markdown_factory()
|
||||
|
||||
md1 = """
|
||||
@@ -65,7 +93,7 @@ this --- is -- a test ...
|
||||
assert 'hellip' in html
|
||||
|
||||
|
||||
def test_smarty_code():
|
||||
def test_smarty_code() -> None:
|
||||
md = markdown_factory()
|
||||
|
||||
md1 = """
|
||||
|
||||
29
tests/test_quickstart.py
Normal file
29
tests/test_quickstart.py
Normal file
@@ -0,0 +1,29 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
|
||||
from pytest import MonkeyPatch
|
||||
|
||||
from blag.quickstart import get_input, quickstart
|
||||
|
||||
|
||||
def test_get_input_default_answer(monkeypatch: MonkeyPatch) -> None:
|
||||
monkeypatch.setattr('builtins.input', lambda x: '')
|
||||
answer = get_input("foo", "bar")
|
||||
assert answer == 'bar'
|
||||
|
||||
|
||||
def test_get_input(monkeypatch: MonkeyPatch) -> None:
|
||||
monkeypatch.setattr('builtins.input', lambda x: 'baz')
|
||||
answer = get_input("foo", "bar")
|
||||
assert answer == 'baz'
|
||||
|
||||
|
||||
def test_quickstart(cleandir: str, monkeypatch: MonkeyPatch) -> None:
|
||||
monkeypatch.setattr('builtins.input', lambda x: 'foo')
|
||||
quickstart(None)
|
||||
with open('config.ini', 'r') as fh:
|
||||
data = fh.read()
|
||||
assert 'base_url = foo' in data
|
||||
assert 'title = foo' in data
|
||||
assert 'description = foo' in data
|
||||
assert 'author = foo' in data
|
||||
@@ -1,7 +1,11 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
import datetime
|
||||
|
||||
from jinja2 import Template
|
||||
|
||||
def test_page(page_template):
|
||||
|
||||
def test_page(page_template: Template) -> None:
|
||||
ctx = {
|
||||
'content': 'this is the content',
|
||||
'title': 'this is the title',
|
||||
@@ -11,17 +15,19 @@ def test_page(page_template):
|
||||
assert 'this is the title' in result
|
||||
|
||||
|
||||
def test_article(article_template):
|
||||
def test_article(article_template: Template) -> None:
|
||||
ctx = {
|
||||
'content': 'this is the content',
|
||||
'title': 'this is the title',
|
||||
'date': datetime.datetime(1980, 5, 9),
|
||||
}
|
||||
result = article_template.render(ctx)
|
||||
assert 'this is the content' in result
|
||||
assert 'this is the title' in result
|
||||
assert '1980-05-09' in result
|
||||
|
||||
|
||||
def test_archive(archive_template):
|
||||
def test_archive(archive_template: Template) -> None:
|
||||
entry = {
|
||||
'title': 'this is a title',
|
||||
'dst': 'https://example.com/link',
|
||||
@@ -39,7 +45,7 @@ def test_archive(archive_template):
|
||||
assert 'https://example.com/link' in result
|
||||
|
||||
|
||||
def test_tags(tags_template):
|
||||
def test_tags(tags_template: Template) -> None:
|
||||
tags = [('foo', 42)]
|
||||
ctx = {
|
||||
'tags': tags,
|
||||
@@ -52,7 +58,7 @@ def test_tags(tags_template):
|
||||
assert '42' in result
|
||||
|
||||
|
||||
def test_tag(tag_template):
|
||||
def test_tag(tag_template: Template) -> None:
|
||||
entry = {
|
||||
'title': 'this is a title',
|
||||
'dst': 'https://example.com/link',
|
||||
|
||||
@@ -1,5 +1,8 @@
|
||||
# remove when we don't support py38 anymore
|
||||
from __future__ import annotations
|
||||
|
||||
import blag
|
||||
|
||||
|
||||
def test_version():
|
||||
def test_version() -> None:
|
||||
assert isinstance(blag.__VERSION__, str)
|
||||
|
||||