Mirror of https://github.com/craigerl/aprsd.git (synced 2025-08-04 06:22:25 -04:00)

Compare commits
No commits in common. "master" and "4.0.2" have entirely different histories.

.github/workflows/release_build.yml (vendored): 2 lines changed

@@ -41,8 +41,8 @@ jobs:
           platforms: linux/amd64,linux/arm64
           file: ./Dockerfile
           build-args: |
-            INSTALL_TYPE=pypi
             VERSION=${{ inputs.aprsd_version }}
+            BRANCH=${{ inputs.aprsd_version }}
             BUILDX_QEMU_ENV=true
           push: true
           tags: |

.gitignore (vendored): 6 lines changed

@@ -60,9 +60,3 @@ AUTHORS
 Makefile.venv
 # Copilot
 .DS_Store
-
-.python-version
-.fleet
-.vscode
-.envrc
-.doit.db

ChangeLog.md: 76 lines changed

@@ -4,41 +4,6 @@ All notable changes to this project will be documented in this file. Dates are d
 
 Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
 
-#### [4.1.2](https://github.com/craigerl/aprsd/compare/4.1.1...4.1.2)
-
-> 6 March 2025
-
-- Allow passing in a custom handler to setup_logging [`d262589`](https://github.com/craigerl/aprsd/commit/d2625893134f498748859da3b1684b04d456f790)
-
-#### [4.1.1](https://github.com/craigerl/aprsd/compare/4.1.0...4.1.1)
-
-> 5 March 2025
-
-- Added new config to disable logging to console [`0fa5b07`](https://github.com/craigerl/aprsd/commit/0fa5b07d4bf4bc5d5aaad1de52b78058e472fe24)
-- Added threads.service [`c1c89fd`](https://github.com/craigerl/aprsd/commit/c1c89fd2c2c69c5e6c5d29a736a7b89e3d45cfe2)
-- Update requirements [`2b185ee`](https://github.com/craigerl/aprsd/commit/2b185ee1b84598c832d8a5d73753cb428854b932)
-- Fixed some more ruff checks [`94ba915`](https://github.com/craigerl/aprsd/commit/94ba915ed44b11eaabc885e033669d67d8c341a5)
-- 4.1.1 release [`7ed8028`](https://github.com/craigerl/aprsd/commit/7ed80283071c1ccebf1e3373727608edd0a56ee9)
-
-#### [4.1.0](https://github.com/craigerl/aprsd/compare/4.0.2...4.1.0)
-
-> 20 February 2025
-
-- Added new PacketFilter mechanism [`#184`](https://github.com/craigerl/aprsd/pull/184)
-- Update to build from pypi [`3b57e75`](https://github.com/craigerl/aprsd/commit/3b57e7597d77303ffc03b082370283bb2fea2838)
-- Updated APRSIS driver [`1606585`](https://github.com/craigerl/aprsd/commit/1606585d41f69133192199d139b53344bb320fa9)
-- Updated packet_list to allow infinit max store [`19c12e7`](https://github.com/craigerl/aprsd/commit/19c12e70f30a6f1f7d223a2f0fd3bf1182579fa4)
-- Update StatsStore to use existing lock [`227ddbf`](https://github.com/craigerl/aprsd/commit/227ddbf148be2e14d4b4f27e48a4b091a98f15df)
-- Try and stop chardet logging! [`101904c`](https://github.com/craigerl/aprsd/commit/101904ca77d816ae9e70bc7d22e6d8516fc3c5ce)
-- Fixed some pep8 failures. [`e9e7e6b`](https://github.com/craigerl/aprsd/commit/e9e7e6b59f9f93f3f09142e56407bc87603a44cb)
-- updated gitignore [`fd517b3`](https://github.com/craigerl/aprsd/commit/fd517b32188fdf15835a74fbd515ce417e7ef1f5)
-- Remove sleep in main RX thread [`6cd7e99`](https://github.com/craigerl/aprsd/commit/6cd7e997139e8f2687bee753d9e0d2b22b1c42a3)
-- Changed Objectstore log to debug [`361663e`](https://github.com/craigerl/aprsd/commit/361663e7d2cf43bd2fd53da0d8c5205bb848dbc2)
-- fix for None packet in rx thread [`d82a81a`](https://github.com/craigerl/aprsd/commit/d82a81a2c3c1a7f50177a0a6435a555daeb858aa)
-- Fix runaway KISS driver on failed connnection [`b6da0eb`](https://github.com/craigerl/aprsd/commit/b6da0ebb0d2f4d7078dbbf91d8c03715412d89ea)
-- CONF.logging.enable_color option added [`06bdb34`](https://github.com/craigerl/aprsd/commit/06bdb34642640d91ea96e3c6e8d8b5a4b8230611)
-- Update Changelog for 4.1.0 release [`a3cda9f`](https://github.com/craigerl/aprsd/commit/a3cda9f37d4c9b955b523f46b2eb8cf412a84407)
-
 #### [4.0.2](https://github.com/craigerl/aprsd/compare/4.0.1...4.0.2)
 
 > 25 January 2025
@@ -51,7 +16,6 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
 - Added uv.lock [`2f26eb8`](https://github.com/craigerl/aprsd/commit/2f26eb86f44625547f72f7c3612494b1bc44bc99)
 - Fix the testing of fortune path [`3c4e200`](https://github.com/craigerl/aprsd/commit/3c4e200d700c24125479bb754b5f68bdf35b85a6)
 - update the install from github in Dockerfile [`bea4815`](https://github.com/craigerl/aprsd/commit/bea481555bc1270ab371a22c69973d648e526d54)
-- Prep for 4.0.2 [`000adef`](https://github.com/craigerl/aprsd/commit/000adef6d4f2792d33980d59d37f4b139e0c693c)
 
 #### [4.0.1](https://github.com/craigerl/aprsd/compare/4.0.0...4.0.1)
 
@@ -60,33 +24,11 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
 - Update pyproject for README.rst -> md [`e080394`](https://github.com/craigerl/aprsd/commit/e08039431ebde92a162ab422c05391dc55d3d3fa)
 - Updated Changelog [`24f5672`](https://github.com/craigerl/aprsd/commit/24f567224cf8ecdebd51f49804425565883acb94)
 
-### [4.0.0](https://github.com/craigerl/aprsd/compare/3.5.0...4.0.0)
+### [4.0.0](https://github.com/craigerl/aprsd/compare/3.4.4...4.0.0)
 
 > 24 January 2025
 
 - Migrate admin web out of aprsd. [`#183`](https://github.com/craigerl/aprsd/pull/183)
-- Enable packet stats for listen command in Docker [`e5d8796`](https://github.com/craigerl/aprsd/commit/e5d8796cda1a007aa868c760b96b50b364351519)
-- Added activity to README [`cdd297c`](https://github.com/craigerl/aprsd/commit/cdd297c5bbc8b93f4739f5850a3e5971ce8baeba)
-- Added star history to readme [`02e2940`](https://github.com/craigerl/aprsd/commit/02e29405ce2f8310e4f87f68498dfd6575c2e43b)
-- removed pytest from README [`1cba31f`](https://github.com/craigerl/aprsd/commit/1cba31f0ac9bd5ee532721a909fc752f023f3b06)
-- Updated Docker for using alpine and uv [`24db814`](https://github.com/craigerl/aprsd/commit/24db814c82c9bb6634566d7428603bf7a9ae37d1)
-- Update the admin and setup.sh for container [`044ea4c`](https://github.com/craigerl/aprsd/commit/044ea4cc9a0059101851d6e722e986ee236833e8)
-- added healthcheck.sh [`1054999`](https://github.com/craigerl/aprsd/commit/10549995686b08e4c166f780efdec5bdae496cab)
-- updated healthcheck.sh [`dabb48c`](https://github.com/craigerl/aprsd/commit/dabb48c6f64062c1fed8f83a4f0b8ffba0c206a5)
-- try making image for webchat [`ba8acdc`](https://github.com/craigerl/aprsd/commit/ba8acdc5849fc7b2d8a1ee11af6f5e317cf30f45)
-- Added APRSD logo [`0ed648f`](https://github.com/craigerl/aprsd/commit/0ed648f8f8a961dbbd9e22bcebadcde525ee41ae)
-- Added plugin and extension links [`447451c`](https://github.com/craigerl/aprsd/commit/447451c6c97e1f2d3d0bf580db21ecd176690258)
-- reduced logo size 50% [`cf4a29f`](https://github.com/craigerl/aprsd/commit/cf4a29f0cb3ed366b21ec3120a189614e0955180)
-- Updated README.md TOC [`375a5e5`](https://github.com/craigerl/aprsd/commit/375a5e5b34718cadc6ee8a51484fc91441440a61)
-- chore: update AUTHORS [skip ci] [`c556f51`](https://github.com/craigerl/aprsd/commit/c556f5126f725904822a75427475d46986f8e9f3)
-- Updated requirements [`4a7a902`](https://github.com/craigerl/aprsd/commit/4a7a902a337759a352560d4d92dc314b1726412a)
-- Updated ChangeLog for 4.0.0 [`934ebd2`](https://github.com/craigerl/aprsd/commit/934ebd236d044625b911dd8ca45293f6c5680a68)
-
-#### [3.5.0](https://github.com/craigerl/aprsd/compare/3.4.4...3.5.0)
-
-> 10 January 2025
-
-- Migrate admin web out of aprsd. [`c48ff8d`](https://github.com/craigerl/aprsd/commit/c48ff8dfd4bd4ce2f95b36e71dce13da5446a658)
 - Remove webchat as a built in command. [`8f8887f`](https://github.com/craigerl/aprsd/commit/8f8887f0e496d960b0e71275893b75408a40fdb2)
 - Remove email plugin [`0880a35`](https://github.com/craigerl/aprsd/commit/0880a356e6df1a0924cbf6e815e68cba5f5c6cf1)
 - Fixed make clean [`ae28dbb`](https://github.com/craigerl/aprsd/commit/ae28dbb0e6bc216bf78c0bd9d7804f57b39091d1)
@@ -95,6 +37,7 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
 - Removed LocationPlugin from aprsd core [`3bba8a1`](https://github.com/craigerl/aprsd/commit/3bba8a19da88b0912064cea786bc9f8203038946)
 - Include haversine library [`bbdbb9a`](https://github.com/craigerl/aprsd/commit/bbdbb9aba189d536497ea3cd7d30911fe3d9d706)
 - Update Makefile [`caa4bb8`](https://github.com/craigerl/aprsd/commit/caa4bb8bd01cbd2e02024d75e1c8af97acf6c657)
+- Enable packet stats for listen command in Docker [`e5d8796`](https://github.com/craigerl/aprsd/commit/e5d8796cda1a007aa868c760b96b50b364351519)
 - Added new KeepAliveCollector [`30d1eb5`](https://github.com/craigerl/aprsd/commit/30d1eb57dd249c609f5b092d8084c40cadda7bd9)
 - Changed to ruff [`72d068c`](https://github.com/craigerl/aprsd/commit/72d068c0b8944c8c9eed494fc23de8d7179ee09b)
 - Changed README.rst -> README.md [`b1a830d`](https://github.com/craigerl/aprsd/commit/b1a830d54e9dec473074b34f9566f161bdec0030)
@@ -115,6 +58,21 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
 - Added .mailmap [`8d98546`](https://github.com/craigerl/aprsd/commit/8d9854605584fa35117af888fe219df610fb7cb4)
 - updated tools in pre-commit [`e4f82d6`](https://github.com/craigerl/aprsd/commit/e4f82d6054d4d859023423bccdd5c402d7a83494)
 - some cleanup [`e332d7c`](https://github.com/craigerl/aprsd/commit/e332d7c9d046066e2686ea0522ae06b86d2f162d)
+- Added activity to README [`cdd297c`](https://github.com/craigerl/aprsd/commit/cdd297c5bbc8b93f4739f5850a3e5971ce8baeba)
+- Added star history to readme [`02e2940`](https://github.com/craigerl/aprsd/commit/02e29405ce2f8310e4f87f68498dfd6575c2e43b)
+- removed pytest from README [`1cba31f`](https://github.com/craigerl/aprsd/commit/1cba31f0ac9bd5ee532721a909fc752f023f3b06)
+- Updated Docker for using alpine and uv [`24db814`](https://github.com/craigerl/aprsd/commit/24db814c82c9bb6634566d7428603bf7a9ae37d1)
+- Update the admin and setup.sh for container [`044ea4c`](https://github.com/craigerl/aprsd/commit/044ea4cc9a0059101851d6e722e986ee236833e8)
+- added healthcheck.sh [`1054999`](https://github.com/craigerl/aprsd/commit/10549995686b08e4c166f780efdec5bdae496cab)
+- updated healthcheck.sh [`dabb48c`](https://github.com/craigerl/aprsd/commit/dabb48c6f64062c1fed8f83a4f0b8ffba0c206a5)
+- try making image for webchat [`ba8acdc`](https://github.com/craigerl/aprsd/commit/ba8acdc5849fc7b2d8a1ee11af6f5e317cf30f45)
+- Added APRSD logo [`0ed648f`](https://github.com/craigerl/aprsd/commit/0ed648f8f8a961dbbd9e22bcebadcde525ee41ae)
+- Added plugin and extension links [`447451c`](https://github.com/craigerl/aprsd/commit/447451c6c97e1f2d3d0bf580db21ecd176690258)
+- reduced logo size 50% [`cf4a29f`](https://github.com/craigerl/aprsd/commit/cf4a29f0cb3ed366b21ec3120a189614e0955180)
+- Updated README.md TOC [`375a5e5`](https://github.com/craigerl/aprsd/commit/375a5e5b34718cadc6ee8a51484fc91441440a61)
+- chore: update AUTHORS [skip ci] [`c556f51`](https://github.com/craigerl/aprsd/commit/c556f5126f725904822a75427475d46986f8e9f3)
+- Updated requirements [`4a7a902`](https://github.com/craigerl/aprsd/commit/4a7a902a337759a352560d4d92dc314b1726412a)
+- Updated ChangeLog for 4.0.0 [`934ebd2`](https://github.com/craigerl/aprsd/commit/934ebd236d044625b911dd8ca45293f6c5680a68)
 
 #### [3.4.4](https://github.com/craigerl/aprsd/compare/3.4.3...3.4.4)
 

@@ -13,35 +13,35 @@ from aprsd.utils import trace
 
 CONF = cfg.CONF
 home = str(Path.home())
-DEFAULT_CONFIG_DIR = f'{home}/.config/aprsd/'
-DEFAULT_SAVE_FILE = f'{home}/.config/aprsd/aprsd.p'
-DEFAULT_CONFIG_FILE = f'{home}/.config/aprsd/aprsd.conf'
+DEFAULT_CONFIG_DIR = f"{home}/.config/aprsd/"
+DEFAULT_SAVE_FILE = f"{home}/.config/aprsd/aprsd.p"
+DEFAULT_CONFIG_FILE = f"{home}/.config/aprsd/aprsd.conf"
 
 
-F = t.TypeVar('F', bound=t.Callable[..., t.Any])
+F = t.TypeVar("F", bound=t.Callable[..., t.Any])
 
 common_options = [
     click.option(
-        '--loglevel',
-        default='INFO',
+        "--loglevel",
+        default="INFO",
         show_default=True,
         type=click.Choice(
-            ['CRITICAL', 'ERROR', 'WARNING', 'INFO', 'DEBUG'],
+            ["CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"],
             case_sensitive=False,
         ),
         show_choices=True,
-        help='The log level to use for aprsd.log',
+        help="The log level to use for aprsd.log",
     ),
     click.option(
-        '-c',
-        '--config',
-        'config_file',
+        "-c",
+        "--config",
+        "config_file",
         show_default=True,
         default=DEFAULT_CONFIG_FILE,
-        help='The aprsd config file to use for options.',
+        help="The aprsd config file to use for options.",
     ),
     click.option(
-        '--quiet',
+        "--quiet",
         is_flag=True,
         default=False,
         help="Don't log to stdout",
@@ -59,7 +59,7 @@ class AliasedGroup(click.Group):
         """
 
         def decorator(f):
-            aliases = kwargs.pop('aliases', [])
+            aliases = kwargs.pop("aliases", [])
             cmd = click.decorators.command(*args, **kwargs)(f)
             self.add_command(cmd)
             for alias in aliases:
@@ -77,7 +77,7 @@ class AliasedGroup(click.Group):
         """
 
         def decorator(f):
-            aliases = kwargs.pop('aliases', [])
+            aliases = kwargs.pop("aliases", [])
             cmd = click.decorators.group(*args, **kwargs)(f)
             self.add_command(cmd)
             for alias in aliases:
@@ -101,37 +101,36 @@ def process_standard_options(f: F) -> F:
         ctx = args[0]
         ctx.ensure_object(dict)
         config_file_found = True
-        if kwargs['config_file']:
-            default_config_files = [kwargs['config_file']]
+        if kwargs["config_file"]:
+            default_config_files = [kwargs["config_file"]]
         else:
             default_config_files = None
 
         try:
             CONF(
                 [],
-                project='aprsd',
+                project="aprsd",
                 version=aprsd.__version__,
                 default_config_files=default_config_files,
             )
         except cfg.ConfigFilesNotFoundError:
             config_file_found = False
-        ctx.obj['loglevel'] = kwargs['loglevel']
+        ctx.obj["loglevel"] = kwargs["loglevel"]
         # ctx.obj["config_file"] = kwargs["config_file"]
-        ctx.obj['quiet'] = kwargs['quiet']
+        ctx.obj["quiet"] = kwargs["quiet"]
         log.setup_logging(
-            ctx.obj['loglevel'],
-            ctx.obj['quiet'],
+            ctx.obj["loglevel"],
+            ctx.obj["quiet"],
         )
         if CONF.trace_enabled:
-            trace.setup_tracing(['method', 'api'])
+            trace.setup_tracing(["method", "api"])
 
         if not config_file_found:
-            LOG = logging.getLogger('APRSD')  # noqa: N806
+            LOG = logging.getLogger("APRSD")  # noqa: N806
             LOG.error("No config file found!! run 'aprsd sample-config'")
 
-        del kwargs['loglevel']
-        del kwargs['config_file']
-        del kwargs['quiet']
+        del kwargs["loglevel"]
+        del kwargs["config_file"]
+        del kwargs["quiet"]
         return f(*args, **kwargs)
 
     return update_wrapper(t.cast(F, new_func), f)
@@ -143,17 +142,17 @@ def process_standard_options_no_config(f: F) -> F:
     def new_func(*args, **kwargs):
         ctx = args[0]
         ctx.ensure_object(dict)
-        ctx.obj['loglevel'] = kwargs['loglevel']
-        ctx.obj['config_file'] = kwargs['config_file']
-        ctx.obj['quiet'] = kwargs['quiet']
+        ctx.obj["loglevel"] = kwargs["loglevel"]
+        ctx.obj["config_file"] = kwargs["config_file"]
+        ctx.obj["quiet"] = kwargs["quiet"]
         log.setup_logging(
-            ctx.obj['loglevel'],
-            ctx.obj['quiet'],
+            ctx.obj["loglevel"],
+            ctx.obj["quiet"],
         )
 
-        del kwargs['loglevel']
-        del kwargs['config_file']
-        del kwargs['quiet']
+        del kwargs["loglevel"]
+        del kwargs["config_file"]
+        del kwargs["quiet"]
         return f(*args, **kwargs)
 
     return update_wrapper(t.cast(F, new_func), f)

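For context, the common_options list and the process_standard_options decorator above are meant to be stacked onto click commands. The sketch below is illustrative only (the file path is not shown in this compare, and add_options is a local helper written here for the example, not necessarily the project's own); it assumes the example lives alongside, or imports, common_options and process_standard_options. Note that @click.pass_context must sit above @process_standard_options so that ctx arrives as the first positional argument of the wrapper.

import click


def add_options(options):
    # Apply a list of click.option decorators, preserving their declared order.
    def _wrap(func):
        for option in reversed(options):
            func = option(func)
        return func

    return _wrap


@click.command()
@add_options(common_options)
@click.pass_context
@process_standard_options
def check(ctx):
    """Example command that inherits --loglevel, --config and --quiet."""
    click.echo("config loaded and logging configured")
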
@@ -1,5 +1,13 @@
-# define the client transports here
-TRANSPORT_APRSIS = 'aprsis'
-TRANSPORT_TCPKISS = 'tcpkiss'
-TRANSPORT_SERIALKISS = 'serialkiss'
-TRANSPORT_FAKE = 'fake'
+from aprsd.client import aprsis, factory, fake, kiss
+
+TRANSPORT_APRSIS = "aprsis"
+TRANSPORT_TCPKISS = "tcpkiss"
+TRANSPORT_SERIALKISS = "serialkiss"
+TRANSPORT_FAKE = "fake"
+
+
+client_factory = factory.ClientFactory()
+client_factory.register(aprsis.APRSISClient)
+client_factory.register(kiss.KISSClient)
+client_factory.register(fake.APRSDFakeClient)

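The registration calls on the 4.0.2 side follow a plain plugin-registry pattern. The sketch below is a simplified stand-in, not aprsd's actual factory.ClientFactory API (which is not shown in this compare); it only illustrates how such a factory can select the first registered client whose transport is enabled and configured, mirroring the is_enabled()/is_configured() checks on the client classes.

class MiniClientFactory:
    """Illustrative stand-in for a transport-selecting client factory."""

    def __init__(self):
        self._classes = []

    def register(self, client_class):
        self._classes.append(client_class)

    def create(self):
        # Pick the first registered client class that reports itself usable.
        for client_class in self._classes:
            if client_class.is_enabled() and client_class.is_configured():
                return client_class()
        raise RuntimeError("No enabled client transport configured")
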
aprsd/client/aprsis.py: new file, 183 lines (@@ -0,0 +1,183 @@)

import datetime
import logging
import time

import timeago
from aprslib.exceptions import LoginError
from loguru import logger
from oslo_config import cfg

from aprsd import client, exception
from aprsd.client import base
from aprsd.client.drivers import aprsis
from aprsd.packets import core

CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOGU = logger


class APRSISClient(base.APRSClient):
    _client = None
    _checks = False

    def __init__(self):
        max_timeout = {"hours": 0.0, "minutes": 2, "seconds": 0}
        self.max_delta = datetime.timedelta(**max_timeout)

    def stats(self, serializable=False) -> dict:
        stats = {}
        if self.is_configured():
            if self._client:
                keepalive = self._client.aprsd_keepalive
                server_string = self._client.server_string
                if serializable:
                    keepalive = keepalive.isoformat()
            else:
                keepalive = "None"
                server_string = "None"
            stats = {
                "connected": self.is_connected,
                "filter": self.filter,
                "login_status": self.login_status,
                "connection_keepalive": keepalive,
                "server_string": server_string,
                "transport": self.transport(),
            }

        return stats

    def keepalive_check(self):
        # Don't check the first time through.
        if not self.is_alive() and self._checks:
            LOG.warning("Resetting client.  It's not alive.")
            self.reset()
        self._checks = True

    def keepalive_log(self):
        if ka := self._client.aprsd_keepalive:
            keepalive = timeago.format(ka)
        else:
            keepalive = "N/A"
        LOGU.opt(colors=True).info(f"<green>Client keepalive {keepalive}</green>")

    @staticmethod
    def is_enabled():
        # Defaults to True if the enabled flag is non existent
        try:
            return CONF.aprs_network.enabled
        except KeyError:
            return False

    @staticmethod
    def is_configured():
        if APRSISClient.is_enabled():
            # Ensure that the config vars are correctly set
            if not CONF.aprs_network.login:
                LOG.error("Config aprs_network.login not set.")
                raise exception.MissingConfigOptionException(
                    "aprs_network.login is not set.",
                )
            if not CONF.aprs_network.password:
                LOG.error("Config aprs_network.password not set.")
                raise exception.MissingConfigOptionException(
                    "aprs_network.password is not set.",
                )
            if not CONF.aprs_network.host:
                LOG.error("Config aprs_network.host not set.")
                raise exception.MissingConfigOptionException(
                    "aprs_network.host is not set.",
                )

            return True
        return True

    def _is_stale_connection(self):
        delta = datetime.datetime.now() - self._client.aprsd_keepalive
        if delta > self.max_delta:
            LOG.error(f"Connection is stale, last heard {delta} ago.")
            return True
        return False

    def is_alive(self):
        if not self._client:
            LOG.warning(f"APRS_CLIENT {self._client} alive? NO!!!")
            return False
        return self._client.is_alive() and not self._is_stale_connection()

    def close(self):
        if self._client:
            self._client.stop()
            self._client.close()

    @staticmethod
    def transport():
        return client.TRANSPORT_APRSIS

    def decode_packet(self, *args, **kwargs):
        """APRS lib already decodes this."""
        return core.factory(args[0])

    def setup_connection(self):
        user = CONF.aprs_network.login
        password = CONF.aprs_network.password
        host = CONF.aprs_network.host
        port = CONF.aprs_network.port
        self.connected = False
        backoff = 1
        aprs_client = None
        retries = 3
        retry_count = 0
        while not self.connected:
            retry_count += 1
            if retry_count >= retries:
                break
            try:
                LOG.info(
                    f"Creating aprslib client({host}:{port}) and logging in {user}."
                )
                aprs_client = aprsis.Aprsdis(
                    user, passwd=password, host=host, port=port
                )
                # Force the log to be the same
                aprs_client.logger = LOG
                aprs_client.connect()
                self.connected = self.login_status["success"] = True
                self.login_status["message"] = aprs_client.server_string
                backoff = 1
            except LoginError as e:
                LOG.error(f"Failed to login to APRS-IS Server '{e}'")
                self.connected = self.login_status["success"] = False
                self.login_status["message"] = e.message
                LOG.error(e.message)
                time.sleep(backoff)
            except Exception as e:
                LOG.error(f"Unable to connect to APRS-IS server. '{e}' ")
                self.connected = self.login_status["success"] = False
                self.login_status["message"] = e.message
                time.sleep(backoff)
            # Don't allow the backoff to go to inifinity.
            if backoff > 5:
                backoff = 5
            else:
                backoff += 1
            continue
        self._client = aprs_client
        return aprs_client

    def consumer(self, callback, blocking=False, immortal=False, raw=False):
        if self._client:
            try:
                self._client.consumer(
                    callback,
                    blocking=blocking,
                    immortal=immortal,
                    raw=raw,
                )
            except Exception as e:
                LOG.error(e)
                LOG.info(e.__cause__)
                raise e
        else:
            LOG.warning("client is None, might be resetting.")
            self.connected = False

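The staleness rule in _is_stale_connection() above reduces to a timedelta comparison against the two-minute budget set in __init__. A small self-contained illustration (the timestamps are made up):

import datetime

# APRSISClient.__init__ allows at most 2 minutes of keepalive silence.
max_delta = datetime.timedelta(hours=0.0, minutes=2, seconds=0)

last_heard = datetime.datetime.now() - datetime.timedelta(minutes=3)
delta = datetime.datetime.now() - last_heard
print(delta > max_delta)  # True: 3 minutes of silence exceeds the 2 minute budget
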
aprsd/client/base.py: new file, 153 lines (@@ -0,0 +1,153 @@)

import abc
import logging
import threading

import wrapt
from oslo_config import cfg

from aprsd.packets import core
from aprsd.utils import keepalive_collector

CONF = cfg.CONF
LOG = logging.getLogger("APRSD")


class APRSClient:
    """Singleton client class that constructs the aprslib connection."""

    _instance = None
    _client = None

    connected = False
    login_status = {
        "success": False,
        "message": None,
    }
    filter = None
    lock = threading.Lock()

    def __new__(cls, *args, **kwargs):
        """This magic turns this into a singleton."""
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            keepalive_collector.KeepAliveCollector().register(cls)
            # Put any initialization here.
            cls._instance._create_client()
        return cls._instance

    @abc.abstractmethod
    def stats(self) -> dict:
        """Return statistics about the client connection.

        Returns:
            dict: Statistics about the connection and packet handling
        """

    @abc.abstractmethod
    def keepalive_check(self) -> None:
        """Called during keepalive run to check status."""
        ...

    @abc.abstractmethod
    def keepalive_log(self) -> None:
        """Log any keepalive information."""
        ...

    @property
    def is_connected(self):
        return self.connected

    @property
    def login_success(self):
        return self.login_status.get("success", False)

    @property
    def login_failure(self):
        return self.login_status["message"]

    def set_filter(self, filter):
        self.filter = filter
        if self._client:
            self._client.set_filter(filter)

    @property
    def client(self):
        if not self._client:
            self._create_client()
        return self._client

    def _create_client(self):
        try:
            self._client = self.setup_connection()
            if self.filter:
                LOG.info("Creating APRS client filter")
                self._client.set_filter(self.filter)
        except Exception as e:
            LOG.error(f"Failed to create APRS client: {e}")
            self._client = None
            raise

    def stop(self):
        if self._client:
            LOG.info("Stopping client connection.")
            self._client.stop()

    def send(self, packet: core.Packet) -> None:
        """Send a packet to the network.

        Args:
            packet: The APRS packet to send
        """
        self.client.send(packet)

    @wrapt.synchronized(lock)
    def reset(self) -> None:
        """Call this to force a rebuild/reconnect."""
        LOG.info("Resetting client connection.")
        if self._client:
            self._client.close()
            del self._client
            self._create_client()
        else:
            LOG.warning("Client not initialized, nothing to reset.")

        # Recreate the client
        LOG.info(f"Creating new client {self.client}")

    @abc.abstractmethod
    def setup_connection(self):
        """Initialize and return the underlying APRS connection.

        Returns:
            object: The initialized connection object
        """

    @staticmethod
    @abc.abstractmethod
    def is_enabled():
        pass

    @staticmethod
    @abc.abstractmethod
    def transport():
        pass

    @abc.abstractmethod
    def decode_packet(self, *args, **kwargs):
        """Decode raw APRS packet data into a Packet object.

        Returns:
            Packet: Decoded APRS packet
        """

    @abc.abstractmethod
    def consumer(self, callback, blocking=False, immortal=False, raw=False):
        pass

    @abc.abstractmethod
    def is_alive(self):
        pass

    @abc.abstractmethod
    def close(self):
        pass

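As a usage note for the base class above: __new__ makes every APRSClient subclass a singleton and immediately calls _create_client(), which calls the subclass's setup_connection(). A minimal sketch, assuming the 4.0.2 package shown above is installed as aprsd; DemoClient and DummyConnection are illustrative names invented for this example, not part of aprsd.

from aprsd.client import base


class DummyConnection:
    """Stand-in for whatever setup_connection() would normally return."""

    def set_filter(self, filter):
        pass


class DemoClient(base.APRSClient):
    """Tiny subclass that fills the abstract hooks with no-ops."""

    def setup_connection(self):
        # Called from _create_client() during the first instantiation.
        return DummyConnection()

    def stats(self) -> dict:
        return {}

    def keepalive_check(self) -> None:
        pass

    def keepalive_log(self) -> None:
        pass

    @staticmethod
    def is_enabled():
        return True

    @staticmethod
    def transport():
        return "demo"

    def decode_packet(self, *args, **kwargs):
        return args[0] if args else None

    def consumer(self, callback, blocking=False, immortal=False, raw=False):
        pass

    def is_alive(self):
        return True

    def close(self):
        pass


# The singleton behaviour provided by __new__: both calls return the same object.
assert DemoClient() is DemoClient()
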
Removed file (present only on master), 141 lines (@@ -1,141 +0,0 @@)

import logging
import threading
from typing import Callable

import timeago
import wrapt
from loguru import logger
from oslo_config import cfg

from aprsd.client import drivers  # noqa - ensure drivers are registered
from aprsd.client.drivers.registry import DriverRegistry
from aprsd.packets import core
from aprsd.utils import keepalive_collector

CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
LOGU = logger


class APRSDClient:
    """APRSD client class.

    This is a singleton class that provides a single instance of the APRSD client.
    It is responsible for connecting to the appropriate APRSD client driver based on
    the configuration.

    """

    _instance = None
    driver = None
    lock = threading.Lock()
    filter = None

    def __new__(cls, *args, **kwargs):
        """This magic turns this into a singleton."""
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            keepalive_collector.KeepAliveCollector().register(cls)
        return cls._instance

    def __init__(self):
        self.connected = False
        self.login_status = {
            'success': False,
            'message': None,
        }
        if not self.driver:
            self.driver = DriverRegistry().get_driver()
            self.driver.setup_connection()

    def stats(self, serializable=False) -> dict:
        stats = {}
        if self.driver:
            stats = self.driver.stats(serializable=serializable)
        return stats

    @property
    def is_enabled(self):
        if not self.driver:
            return False
        return self.driver.is_enabled()

    @property
    def is_configured(self):
        if not self.driver:
            return False
        return self.driver.is_configured()

    # @property
    # def is_connected(self):
    #     if not self.driver:
    #         return False
    #     return self.driver.is_connected()

    @property
    def login_success(self):
        if not self.driver:
            return False
        return self.driver.login_success

    @property
    def login_failure(self):
        if not self.driver:
            return None
        return self.driver.login_failure

    def set_filter(self, filter):
        self.filter = filter
        if not self.driver:
            return
        self.driver.set_filter(filter)

    def get_filter(self):
        if not self.driver:
            return None
        return self.driver.filter

    def is_alive(self):
        return self.driver.is_alive()

    def close(self):
        if not self.driver:
            return
        self.driver.close()

    @wrapt.synchronized(lock)
    def reset(self):
        """Call this to force a rebuild/reconnect."""
        LOG.info('Resetting client connection.')
        if self.driver:
            self.driver.close()
            self.driver.setup_connection()
            if self.filter:
                self.driver.set_filter(self.filter)
        else:
            LOG.warning('Client not initialized, nothing to reset.')

    def send(self, packet: core.Packet) -> bool:
        return self.driver.send(packet)

    # For the keepalive collector
    def keepalive_check(self):
        # Don't check the first time through.
        if not self.driver.is_alive and self._checks:
            LOG.warning("Resetting client.  It's not alive.")
            self.reset()
        self._checks = True

    # For the keepalive collector
    def keepalive_log(self):
        if ka := self.driver.keepalive:
            keepalive = timeago.format(ka)
        else:
            keepalive = 'N/A'
        LOGU.opt(colors=True).info(f'<green>Client keepalive {keepalive}</green>')

    def consumer(self, callback: Callable, raw: bool = False):
        return self.driver.consumer(callback=callback, raw=raw)

    def decode_packet(self, *args, **kwargs) -> core.Packet:
        return self.driver.decode_packet(*args, **kwargs)

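Both reset() implementations in this compare rely on the same locking idiom: wrapt.synchronized called with an explicit threading.Lock, so concurrent resets are serialized. A small standalone illustration (the class and method names here are made up for the example):

import threading

import wrapt

lock = threading.Lock()


class Resettable:
    @wrapt.synchronized(lock)
    def reset(self):
        # Only one thread at a time can run this body; others block on `lock`.
        print("resetting")
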
Removed file (present only on master), 10 lines (@@ -1,10 +0,0 @@)

# All client drivers must be registered here
from aprsd.client.drivers.aprsis import APRSISDriver
from aprsd.client.drivers.fake import APRSDFakeDriver
from aprsd.client.drivers.registry import DriverRegistry
from aprsd.client.drivers.tcpkiss import TCPKISSDriver

driver_registry = DriverRegistry()
driver_registry.register(APRSDFakeDriver)
driver_registry.register(APRSISDriver)
driver_registry.register(TCPKISSDriver)

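On the master side, additional transports hook into this same registry. A hedged sketch: the register() and get_driver() calls appear in the listings above, but MyDriver and its method set are hypothetical, inferred only from how APRSDClient (removed file above) calls into its driver, and the registry's selection rules live in registry.py, which is not part of this compare.

from aprsd.client.drivers.registry import DriverRegistry


class MyDriver:
    """Hypothetical driver used only to illustrate registration."""

    @staticmethod
    def is_enabled():
        return True

    @staticmethod
    def is_configured():
        return True

    def setup_connection(self):
        pass

    def is_alive(self):
        return True

    def close(self):
        pass

    def stats(self, serializable=False):
        return {}


DriverRegistry().register(MyDriver)
driver = DriverRegistry().get_driver()  # selection rules are defined in registry.py
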
@@ -1,205 +1,234 @@

This hunk replaces the entire file; the two sides carry different implementations of the
APRS-IS connection code, so they are reconstructed separately below.

Removed in this compare (master version, 205 lines):

import datetime
import logging
import time
from typing import Callable

from aprslib.exceptions import LoginError
from loguru import logger
from oslo_config import cfg

from aprsd import client, exception
from aprsd.client.drivers.lib.aprslib import APRSLibClient
from aprsd.packets import core

CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
LOGU = logger


# class APRSISDriver(metaclass=trace.TraceWrapperMetaclass):
class APRSISDriver:
    """This is the APRS-IS driver for the APRSD client.

    This driver uses our modified aprslib.IS class to connect to the APRS-IS server.

    """

    _client = None
    _checks = False

    def __init__(self):
        max_timeout = {'hours': 0.0, 'minutes': 2, 'seconds': 0}
        self.max_delta = datetime.timedelta(**max_timeout)
        self.login_status = {
            'success': False,
            'message': None,
        }

    @staticmethod
    def is_enabled():
        # Defaults to True if the enabled flag is non existent
        try:
            return CONF.aprs_network.enabled
        except KeyError:
            return False

    @staticmethod
    def is_configured():
        if APRSISDriver.is_enabled():
            # Ensure that the config vars are correctly set
            if not CONF.aprs_network.login:
                LOG.error('Config aprs_network.login not set.')
                raise exception.MissingConfigOptionException(
                    'aprs_network.login is not set.',
                )
            if not CONF.aprs_network.password:
                LOG.error('Config aprs_network.password not set.')
                raise exception.MissingConfigOptionException(
                    'aprs_network.password is not set.',
                )
            if not CONF.aprs_network.host:
                LOG.error('Config aprs_network.host not set.')
                raise exception.MissingConfigOptionException(
                    'aprs_network.host is not set.',
                )

            return True
        return True

    @property
    def is_alive(self):
        if not self._client:
            LOG.warning(f'APRS_CLIENT {self._client} alive? NO!!!')
            return False
        return self._client.is_alive() and not self._is_stale_connection()

    def close(self):
        if self._client:
            self._client.stop()
            self._client.close()

    def send(self, packet: core.Packet) -> bool:
        return self._client.send(packet)

    def setup_connection(self):
        user = CONF.aprs_network.login
        password = CONF.aprs_network.password
        host = CONF.aprs_network.host
        port = CONF.aprs_network.port
        self.connected = False
        backoff = 1
        retries = 3
        retry_count = 0
        while not self.connected:
            retry_count += 1
            if retry_count >= retries:
                break
            try:
                LOG.info(
                    f'Creating aprslib client({host}:{port}) and logging in {user}.'
                )
                self._client = APRSLibClient(
                    user, passwd=password, host=host, port=port
                )
                # Force the log to be the same
                self._client.logger = LOG
                self._client.connect()
                self.connected = self.login_status['success'] = True
                self.login_status['message'] = self._client.server_string
                backoff = 1
            except LoginError as e:
                LOG.error(f"Failed to login to APRS-IS Server '{e}'")
                self.connected = self.login_status['success'] = False
                self.login_status['message'] = (
                    e.message if hasattr(e, 'message') else str(e)
                )
                LOG.error(self.login_status['message'])
                time.sleep(backoff)
            except Exception as e:
                LOG.error(f"Unable to connect to APRS-IS server. '{e}' ")
                self.connected = self.login_status['success'] = False
                self.login_status['message'] = getattr(e, 'message', str(e))
                time.sleep(backoff)
            # Don't allow the backoff to go to inifinity.
            if backoff > 5:
                backoff = 5
            else:
                backoff += 1
            continue

    def set_filter(self, filter):
        self._client.set_filter(filter)

    def login_success(self) -> bool:
        return self.login_status.get('success', False)

    def login_failure(self) -> str:
        return self.login_status.get('message', None)

    @property
    def filter(self):
        return self._client.filter

    @property
    def server_string(self):
        return self._client.server_string

    @property
    def keepalive(self):
        return self._client.aprsd_keepalive

    def _is_stale_connection(self):
        delta = datetime.datetime.now() - self._client.aprsd_keepalive
        if delta > self.max_delta:
            LOG.error(f'Connection is stale, last heard {delta} ago.')
            return True
        return False

    @staticmethod
    def transport():
        return client.TRANSPORT_APRSIS

    def decode_packet(self, *args, **kwargs):
        """APRS lib already decodes this."""
        return core.factory(args[0])

    def consumer(self, callback: Callable, raw: bool = False):
        if self._client:
            try:
                self._client.consumer(
                    callback,
                    blocking=False,
                    immortal=False,
                    raw=raw,
                )
            except Exception as e:
                LOG.error(e)
                LOG.info(e.__cause__)
                raise e
        else:
            LOG.warning('client is None, might be resetting.')
            self.connected = False

    def stats(self, serializable=False) -> dict:
        stats = {}
        if self.is_configured():
            if self._client:
                keepalive = self._client.aprsd_keepalive
                server_string = self._client.server_string
                if serializable:
                    keepalive = keepalive.isoformat()
                filter = self.filter
            else:
                keepalive = 'None'
                server_string = 'None'
                filter = 'None'
            stats = {
                'connected': self.is_alive,
                'filter': filter,
                'login_status': self.login_status,
                'connection_keepalive': keepalive,
                'server_string': server_string,
                'transport': self.transport(),
            }

        return stats

Added in this compare (4.0.2 version, 234 lines):

import datetime
import logging
import select
import threading

import aprslib
import wrapt
from aprslib import is_py3
from aprslib.exceptions import (
    ConnectionDrop,
    ConnectionError,
    GenericError,
    LoginError,
    ParseError,
    UnknownFormat,
)

import aprsd
from aprsd.packets import core

LOG = logging.getLogger("APRSD")


class Aprsdis(aprslib.IS):
    """Extend the aprslib class so we can exit properly."""

    # flag to tell us to stop
    thread_stop = False

    # date for last time we heard from the server
    aprsd_keepalive = datetime.datetime.now()

    # Which server we are connected to?
    server_string = "None"

    # timeout in seconds
    select_timeout = 1
    lock = threading.Lock()

    def stop(self):
        self.thread_stop = True
        LOG.warning("Shutdown Aprsdis client.")

    def close(self):
        LOG.warning("Closing Aprsdis client.")
        super().close()

    @wrapt.synchronized(lock)
    def send(self, packet: core.Packet):
        """Send an APRS Message object."""
        self.sendall(packet.raw)

    def is_alive(self):
        """If the connection is alive or not."""
        return self._connected

    def _socket_readlines(self, blocking=False):
        """
        Generator for complete lines, received from the server
        """
        try:
            self.sock.setblocking(0)
        except OSError as e:
            self.logger.error(f"socket error when setblocking(0): {str(e)}")
            raise aprslib.ConnectionDrop("connection dropped")

        while not self.thread_stop:
            short_buf = b""
            newline = b"\r\n"

            # set a select timeout, so we get a chance to exit
            # when user hits CTRL-C
            readable, writable, exceptional = select.select(
                [self.sock],
                [],
                [],
                self.select_timeout,
            )
            if not readable:
                if not blocking:
                    break
                else:
                    continue

            try:
                short_buf = self.sock.recv(4096)

                # sock.recv returns empty if the connection drops
                if not short_buf:
                    if not blocking:
                        # We could just not be blocking, so empty is expected
                        continue
                    else:
                        self.logger.error("socket.recv(): returned empty")
                        raise aprslib.ConnectionDrop("connection dropped")
            except OSError as e:
                # self.logger.error("socket error on recv(): %s" % str(e))
                if "Resource temporarily unavailable" in str(e):
                    if not blocking:
                        if len(self.buf) == 0:
                            break

            self.buf += short_buf

            while newline in self.buf:
                line, self.buf = self.buf.split(newline, 1)

                yield line

    def _send_login(self):
        """
        Sends login string to server
        """
        login_str = "user {0} pass {1} vers github.com/craigerl/aprsd {3}{2}\r\n"
        login_str = login_str.format(
            self.callsign,
            self.passwd,
            (" filter " + self.filter) if self.filter != "" else "",
            aprsd.__version__,
        )

        self.logger.debug("Sending login information")

        try:
            self._sendall(login_str)
            self.sock.settimeout(5)
            test = self.sock.recv(len(login_str) + 100)
            if is_py3:
                test = test.decode("latin-1")
            test = test.rstrip()

            self.logger.debug("Server: '%s'", test)

            if not test:
                raise LoginError(f"Server Response Empty: '{test}'")

            _, _, callsign, status, e = test.split(" ", 4)
            s = e.split(",")
            if len(s):
                server_string = s[0].replace("server ", "")
            else:
                server_string = e.replace("server ", "")

            if callsign == "":
                raise LoginError("Server responded with empty callsign???")
            if callsign != self.callsign:
                raise LoginError(f"Server: {test}")
            if status != "verified," and self.passwd != "-1":
                raise LoginError("Password is incorrect")

            if self.passwd == "-1":
                self.logger.info("Login successful (receive only)")
            else:
                self.logger.info("Login successful")

            self.logger.info(f"Connected to {server_string}")
            self.server_string = server_string

        except LoginError as e:
            self.logger.error(str(e))
            self.close()
            raise
        except Exception as e:
            self.close()
            self.logger.error(f"Failed to login '{e}'")
            self.logger.exception(e)
            raise LoginError("Failed to login")

    def consumer(self, callback, blocking=True, immortal=False, raw=False):
        """
        When a position sentence is received, it will be passed to the callback function

        blocking: if true (default), runs forever, otherwise will return after one sentence
        You can still exit the loop, by raising StopIteration in the callback function

        immortal: When true, consumer will try to reconnect and stop propagation of Parse exceptions
        if false (default), consumer will return

        raw: when true, raw packet is passed to callback, otherwise the result from aprs.parse()
        """

        if not self._connected:
            raise ConnectionError("not connected to a server")

        line = b""

        while True and not self.thread_stop:
            try:
                for line in self._socket_readlines(blocking):
                    if line[0:1] != b"#":
                        self.aprsd_keepalive = datetime.datetime.now()
                        if raw:
                            callback(line)
                        else:
                            callback(self._parse(line))
                    else:
                        self.logger.debug("Server: %s", line.decode("utf8"))
                        self.aprsd_keepalive = datetime.datetime.now()
            except ParseError as exp:
                self.logger.log(
                    11,
                    "%s Packet: '%s'",
                    exp,
                    exp.packet,
                )
            except UnknownFormat as exp:
                self.logger.log(
                    9,
                    "%s Packet: '%s'",
                    exp,
                    exp.packet,
                )
            except LoginError as exp:
                self.logger.error("%s: %s", exp.__class__.__name__, exp)
            except (KeyboardInterrupt, SystemExit):
                raise
            except (ConnectionDrop, ConnectionError):
                self.close()

                if not immortal:
                    raise
                else:
                    self.connect(blocking=blocking)
                    continue
            except GenericError:
                pass
            except StopIteration:
                break
            except Exception:
                self.logger.error("APRS Packet: %s", line)
                raise

            if not blocking:
                break

@@ -2,7 +2,6 @@ import datetime
 import logging
 import threading
 import time
-from typing import Callable
 
 import aprslib
 import wrapt
@@ -16,7 +15,7 @@ CONF = cfg.CONF
 LOG = logging.getLogger('APRSD')
 
 
-class APRSDFakeDriver(metaclass=trace.TraceWrapperMetaclass):
+class APRSDFakeClient(metaclass=trace.TraceWrapperMetaclass):
     """Fake client for testing."""
 
     # flag to tell us to stop
@@ -29,40 +28,17 @@ class APRSDFakeDriver(metaclass=trace.TraceWrapperMetaclass):
     path = []
 
     def __init__(self):
-        LOG.info('Starting APRSDFakeDriver driver.')
+        LOG.info('Starting APRSDFakeClient client.')
         self.path = ['WIDE1-1', 'WIDE2-1']
 
-    @staticmethod
-    def is_enabled():
-        if CONF.fake_client.enabled:
-            return True
-        return False
-
-    @staticmethod
-    def is_configured():
-        return APRSDFakeDriver.is_enabled
+    def stop(self):
+        self.thread_stop = True
+        LOG.info('Shutdown APRSDFakeClient client.')
 
     def is_alive(self):
         """If the connection is alive or not."""
         return not self.thread_stop
 
-    def close(self):
-        self.thread_stop = True
-        LOG.info('Shutdown APRSDFakeDriver driver.')
-
-    def setup_connection(self):
-        # It's fake....
-        pass
-
-    def set_filter(self, filter: str) -> None:
-        pass
-
-    def login_success(self) -> bool:
-        return True
-
-    def login_failure(self) -> str:
-        return None
-
     @wrapt.synchronized(lock)
     def send(self, packet: core.Packet):
         """Send an APRS Message object."""
@@ -85,37 +61,13 @@ class APRSDFakeDriver(metaclass=trace.TraceWrapperMetaclass):
             f'\'{packet.from_call}\' with PATH "{self.path}"',
         )
 
-    def consumer(self, callback: Callable, raw: bool = False):
+    def consumer(self, callback, blocking=False, immortal=False, raw=False):
         LOG.debug('Start non blocking FAKE consumer')
         # Generate packets here?
-        raw_str = 'GTOWN>APDW16,WIDE1-1,WIDE2-1:}KM6LYW-9>APZ100,TCPIP,GTOWN*::KM6LYW :KM6LYW: 19 Miles SW'
+        raw = 'GTOWN>APDW16,WIDE1-1,WIDE2-1:}KM6LYW-9>APZ100,TCPIP,GTOWN*::KM6LYW :KM6LYW: 19 Miles SW'
+        pkt_raw = aprslib.parse(raw)
+        pkt = core.factory(pkt_raw)
         self.aprsd_keepalive = datetime.datetime.now()
-        if raw:
-            callback(raw=raw_str)
-        else:
-            pkt_raw = aprslib.parse(raw_str)
-            pkt = core.factory(pkt_raw)
-            callback(packet=pkt)
-
+        callback(packet=pkt)
         LOG.debug(f'END blocking FAKE consumer {self}')
-        time.sleep(1)
-
-    def decode_packet(self, *args, **kwargs):
-        """APRS lib already decodes this."""
-        if not kwargs:
-            return None
-
-        if kwargs.get('packet'):
-            return kwargs.get('packet')
-
-        if kwargs.get('raw'):
-            pkt_raw = aprslib.parse(kwargs.get('raw'))
-            pkt = core.factory(pkt_raw)
-            return pkt
-
-    def stats(self, serializable: bool = False) -> dict:
-        return {
-            'driver': self.__class__.__name__,
-            'is_alive': self.is_alive(),
-            'transport': 'fake',
-        }
+        time.sleep(8)

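A short usage sketch against the master-branch APRSDFakeDriver shown on the left of the hunk above: its consumer may hand the callback either raw= or packet=, and decode_packet() normalizes both shapes into a Packet object. The callback body and the print statement are illustrative only.

from aprsd.client.drivers.fake import APRSDFakeDriver

driver = APRSDFakeDriver()


def rx_callback(**kwargs):
    # kwargs is either {"raw": "..."} or {"packet": Packet}; decode_packet handles both.
    packet = driver.decode_packet(**kwargs)
    if packet:
        print(f"Received {packet.__class__.__name__} from {packet.from_call}")


driver.consumer(rx_callback, raw=False)
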
aprsd/client/drivers/kiss.py  (new file, 121 lines)
@@ -0,0 +1,121 @@
import datetime
import logging

import kiss
from ax253 import Frame
from oslo_config import cfg

from aprsd import conf  # noqa
from aprsd.packets import core
from aprsd.utils import trace

CONF = cfg.CONF
LOG = logging.getLogger('APRSD')


class KISS3Client:
    path = []

    # date for last time we heard from the server
    aprsd_keepalive = datetime.datetime.now()

    def __init__(self):
        self.setup()

    def is_alive(self):
        return True

    def setup(self):
        # we can be TCP kiss or Serial kiss
        if CONF.kiss_serial.enabled:
            LOG.debug(
                'KISS({}) Serial connection to {}'.format(
                    kiss.__version__,
                    CONF.kiss_serial.device,
                ),
            )
            self.kiss = kiss.SerialKISS(
                port=CONF.kiss_serial.device,
                speed=CONF.kiss_serial.baudrate,
                strip_df_start=True,
            )
            self.path = CONF.kiss_serial.path
        elif CONF.kiss_tcp.enabled:
            LOG.debug(
                'KISS({}) TCP Connection to {}:{}'.format(
                    kiss.__version__,
                    CONF.kiss_tcp.host,
                    CONF.kiss_tcp.port,
                ),
            )
            self.kiss = kiss.TCPKISS(
                host=CONF.kiss_tcp.host,
                port=CONF.kiss_tcp.port,
                strip_df_start=True,
            )
            self.path = CONF.kiss_tcp.path

        LOG.debug('Starting KISS interface connection')
        self.kiss.start()

    @trace.trace
    def stop(self):
        try:
            self.kiss.stop()
            self.kiss.loop.call_soon_threadsafe(
                self.kiss.protocol.transport.close,
            )
        except Exception as ex:
            LOG.exception(ex)

    def set_filter(self, filter):
        # This does nothing right now.
        pass

    def parse_frame(self, frame_bytes):
        try:
            frame = Frame.from_bytes(frame_bytes)
            # Now parse it with aprslib
            kwargs = {
                'frame': frame,
            }
            self._parse_callback(**kwargs)
            self.aprsd_keepalive = datetime.datetime.now()
        except Exception as ex:
            LOG.error('Failed to parse bytes received from KISS interface.')
            LOG.exception(ex)

    def consumer(self, callback):
        self._parse_callback = callback
        self.kiss.read(callback=self.parse_frame, min_frames=None)

    def send(self, packet):
        """Send an APRS Message object."""

        payload = None
        path = self.path
        if isinstance(packet, core.Packet):
            packet.prepare()
            payload = packet.payload.encode('US-ASCII')
            if packet.path:
                path = packet.path
        else:
            msg_payload = f'{packet.raw}{{{str(packet.msgNo)}'
            payload = (
                ':{:<9}:{}'.format(
                    packet.to_call,
                    msg_payload,
                )
            ).encode('US-ASCII')

        LOG.debug(
            f"KISS Send '{payload}' TO '{packet.to_call}' From "
            f"'{packet.from_call}' with PATH '{path}'",
        )
        frame = Frame.ui(
            destination='APZ100',
            source=packet.from_call,
            path=path,
            info=payload,
        )
        self.kiss.write(frame)
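
For orientation, here is a minimal sketch of how a driver like KISS3Client above gets exercised, using only the methods shown in the listing (setup on construction, consumer() with a frame callback, send() with a packet object); the callback body and variable names are illustrative, not code from the repository:

    # Assumes CONF.kiss_tcp.* (or CONF.kiss_serial.*) is already configured.
    from aprsd.client.drivers.kiss import KISS3Client


    def on_frame(frame):
        # KISS3Client.parse_frame() invokes the callback as callback(frame=<ax253 Frame>).
        print(f'received frame: {frame}')


    tnc = KISS3Client()      # __init__() calls setup(), which opens the TCP or serial KISS link
    tnc.consumer(on_frame)   # blocks in kiss.read(), handing each decoded frame to on_frame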
@@ -1,296 +0,0 @@  (whole file removed)
import datetime
import logging
import select
import socket
import threading

import aprslib
import wrapt
from aprslib import is_py3
from aprslib.exceptions import (
    ConnectionDrop,
    ConnectionError,
    GenericError,
    LoginError,
    ParseError,
    UnknownFormat,
)

import aprsd
from aprsd.packets import core

LOG = logging.getLogger('APRSD')


class APRSLibClient(aprslib.IS):
    """Extend the aprslib class so we can exit properly.

    This is a modified version of the aprslib.IS class that adds a stop method to
    allow the client to exit cleanly.

    The aprsis driver uses this class to connect to the APRS-IS server.
    """

    # flag to tell us to stop
    thread_stop = False

    # date for last time we heard from the server
    aprsd_keepalive = datetime.datetime.now()

    # Which server we are connected to?
    server_string = 'None'

    # timeout in seconds
    select_timeout = 1
    lock = threading.Lock()

    def stop(self):
        self.thread_stop = True
        LOG.warning('Shutdown Aprsdis client.')

    def close(self):
        LOG.warning('Closing Aprsdis client.')
        super().close()

    @wrapt.synchronized(lock)
    def send(self, packet: core.Packet):
        """Send an APRS Message object."""
        self.sendall(packet.raw)

    def is_alive(self):
        """If the connection is alive or not."""
        return self._connected

    def _connect(self):
        """
        Attemps connection to the server
        """
        self.logger.info(
            'Attempting connection to %s:%s', self.server[0], self.server[1]
        )

        try:
            self._open_socket()

            peer = self.sock.getpeername()

            self.logger.info('Connected to %s', str(peer))

            # 5 second timeout to receive server banner
            self.sock.setblocking(1)
            self.sock.settimeout(5)

            self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
            # MACOS doesn't have TCP_KEEPIDLE
            if hasattr(socket, 'TCP_KEEPIDLE'):
                self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 1)
            self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 3)
            self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)

            banner = self.sock.recv(512)
            if is_py3:
                banner = banner.decode('latin-1')

            if banner[0] == '#':
                self.logger.debug('Banner: %s', banner.rstrip())
            else:
                raise ConnectionError('invalid banner from server')

        except ConnectionError as e:
            self.logger.error(str(e))
            self.close()
            raise
        except (socket.error, socket.timeout) as e:
            self.close()

            self.logger.error('Socket error: %s' % str(e))
            if str(e) == 'timed out':
                raise ConnectionError('no banner from server') from e
            else:
                raise ConnectionError(e) from e

        self._connected = True

    def _socket_readlines(self, blocking=False):
        """
        Generator for complete lines, received from the server
        """
        try:
            self.sock.setblocking(0)
        except OSError as e:
            self.logger.error(f'socket error when setblocking(0): {str(e)}')
            raise aprslib.ConnectionDrop('connection dropped') from e

        while not self.thread_stop:
            short_buf = b''
            newline = b'\r\n'

            # set a select timeout, so we get a chance to exit
            # when user hits CTRL-C
            readable, writable, exceptional = select.select(
                [self.sock],
                [],
                [],
                self.select_timeout,
            )
            if not readable:
                if not blocking:
                    break
                else:
                    continue

            try:
                short_buf = self.sock.recv(4096)

                # sock.recv returns empty if the connection drops
                if not short_buf:
                    if not blocking:
                        # We could just not be blocking, so empty is expected
                        continue
                    else:
                        self.logger.error('socket.recv(): returned empty')
                        raise aprslib.ConnectionDrop('connection dropped')
            except OSError as e:
                # self.logger.error("socket error on recv(): %s" % str(e))
                if 'Resource temporarily unavailable' in str(e):
                    if not blocking:
                        if len(self.buf) == 0:
                            break

            self.buf += short_buf

            while newline in self.buf:
                line, self.buf = self.buf.split(newline, 1)

                yield line

    def _send_login(self):
        """
        Sends login string to server
        """
        login_str = 'user {0} pass {1} vers Python-APRSD {3}{2}\r\n'
        login_str = login_str.format(
            self.callsign,
            self.passwd,
            (' filter ' + self.filter) if self.filter != '' else '',
            aprsd.__version__,
        )

        self.logger.debug('Sending login information')

        try:
            self._sendall(login_str)
            self.sock.settimeout(5)
            test = self.sock.recv(len(login_str) + 100)
            if is_py3:
                test = test.decode('latin-1')
            test = test.rstrip()

            self.logger.debug("Server: '%s'", test)

            if not test:
                raise LoginError(f"Server Response Empty: '{test}'")

            _, _, callsign, status, e = test.split(' ', 4)
            s = e.split(',')
            if len(s):
                server_string = s[0].replace('server ', '')
            else:
                server_string = e.replace('server ', '')

            if callsign == '':
                raise LoginError('Server responded with empty callsign???')
            if callsign != self.callsign:
                raise LoginError(f'Server: {test}')
            if status != 'verified,' and self.passwd != '-1':
                raise LoginError('Password is incorrect')

            if self.passwd == '-1':
                self.logger.info('Login successful (receive only)')
            else:
                self.logger.info('Login successful')

            self.logger.info(f'Connected to {server_string}')
            self.server_string = server_string

        except LoginError as e:
            self.logger.error(str(e))
            self.close()
            raise
        except Exception as e:
            self.close()
            self.logger.error(f"Failed to login '{e}'")
            self.logger.exception(e)
            raise LoginError('Failed to login') from e

    def consumer(self, callback, blocking=True, immortal=False, raw=False):
        """
        When a position sentence is received, it will be passed to the callback function

        blocking: if true (default), runs forever, otherwise will return after one sentence
                  You can still exit the loop, by raising StopIteration in the callback function

        immortal: When true, consumer will try to reconnect and stop propagation of Parse exceptions
                  if false (default), consumer will return

        raw: when true, raw packet is passed to callback, otherwise the result from aprs.parse()
        """
        if not self._connected:
            raise ConnectionError('not connected to a server')

        line = b''

        while not self.thread_stop:
            try:
                for line in self._socket_readlines(blocking):
                    if line[0:1] != b'#':
                        self.aprsd_keepalive = datetime.datetime.now()
                        if raw:
                            callback(line)
                        else:
                            callback(self._parse(line))
                    else:
                        self.logger.debug('Server: %s', line.decode('utf8'))
                        self.aprsd_keepalive = datetime.datetime.now()
            except ParseError as exp:
                self.logger.log(
                    11,
                    "%s Packet: '%s'",
                    exp,
                    exp.packet,
                )
            except UnknownFormat as exp:
                self.logger.log(
                    9,
                    "%s Packet: '%s'",
                    exp,
                    exp.packet,
                )
            except LoginError as exp:
                self.logger.error('%s: %s', exp.__class__.__name__, exp)
            except (KeyboardInterrupt, SystemExit):
                raise
            except (ConnectionDrop, ConnectionError):
                self.close()

                if not immortal:
                    raise
                else:
                    self.connect(blocking=blocking)
                    continue
            except GenericError:
                pass
            except StopIteration:
                break
            except IOError:
                if not self.thread_stop:
                    self.logger.error('IOError')
                break
            except Exception:
                self.logger.error('APRS Packet: %s', line)
                raise

            if not blocking:
                break
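
As a concrete illustration of the login line that _send_login() above builds, the format string expands like this (the callsign, passcode, filter, and version values below are made up):

    login_str = 'user {0} pass {1} vers Python-APRSD {3}{2}\r\n'
    print(login_str.format('N0CALL', '12345', ' filter r/45.0/-122.0/50', '4.1.2'))
    # -> user N0CALL pass 12345 vers Python-APRSD 4.1.2 filter r/45.0/-122.0/50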
@@ -1,86 +0,0 @@  (whole file removed)
from typing import Callable, Protocol, runtime_checkable

from aprsd.packets import core
from aprsd.utils import singleton, trace


@runtime_checkable
class ClientDriver(Protocol):
    """Protocol for APRSD client drivers.

    This protocol defines the methods that must be
    implemented by APRSD client drivers.
    """

    @staticmethod
    def is_enabled(self) -> bool:
        pass

    @staticmethod
    def is_configured(self) -> bool:
        pass

    def is_alive(self) -> bool:
        pass

    def close(self) -> None:
        pass

    def send(self, packet: core.Packet) -> bool:
        pass

    def setup_connection(self) -> None:
        pass

    def set_filter(self, filter: str) -> None:
        pass

    def login_success(self) -> bool:
        pass

    def login_failure(self) -> str:
        pass

    def consumer(self, callback: Callable, raw: bool = False) -> None:
        pass

    def decode_packet(self, *args, **kwargs) -> core.Packet:
        pass

    def stats(self, serializable: bool = False) -> dict:
        pass


@singleton
class DriverRegistry(metaclass=trace.TraceWrapperMetaclass):
    """Registry for APRSD client drivers.

    This registry is used to register and unregister APRSD client drivers.

    This allows us to dynamically load the configured driver at runtime.

    All drivers are registered, then when aprsd needs the client, the
    registry provides the configured driver for the single instance of the
    single APRSD client.
    """

    def __init__(self):
        self.drivers = []

    def register(self, driver: Callable):
        if not isinstance(driver, ClientDriver):
            raise ValueError('Driver must be of ClientDriver type')
        self.drivers.append(driver)

    def unregister(self, driver: Callable):
        if driver in self.drivers:
            self.drivers.remove(driver)
        else:
            raise ValueError(f'Driver {driver} not found')

    def get_driver(self) -> ClientDriver:
        """Get the first enabled driver."""
        for driver in self.drivers:
            if driver.is_enabled() and driver.is_configured():
                return driver()
        raise ValueError('No enabled driver found')
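
For context, a minimal sketch of how a registry like the one above (present only on the master side of this compare) is meant to be used, inferred solely from the register()/get_driver() methods shown; the import path of the removed module and the use of APRSDFakeDriver are assumptions:

    from aprsd.client.drivers.fake import APRSDFakeDriver
    from aprsd.client.drivers.registry import DriverRegistry  # module path assumed

    registry = DriverRegistry()          # @singleton: every call returns the same instance
    registry.register(APRSDFakeDriver)   # the class must satisfy the ClientDriver protocol
    driver = registry.get_driver()       # first registered driver that is enabled and configured
    driver.setup_connection()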
@@ -1,408 +0,0 @@  (whole file removed)
"""
APRSD KISS Client Driver using native KISS implementation.

This module provides a KISS client driver for APRSD using the new
non-asyncio KISSInterface implementation.
"""

import datetime
import logging
import select
import socket
import time
from typing import Any, Callable, Dict

import aprslib
from ax253 import frame as ax25frame
from kiss import constants as kiss_constants
from kiss import util as kissutil
from kiss.kiss import Command
from oslo_config import cfg

from aprsd import (  # noqa
    client,
    conf,  # noqa
    exception,
)
from aprsd.packets import core

CONF = cfg.CONF
LOG = logging.getLogger('APRSD')


def handle_fend(buffer: bytes, strip_df_start: bool = True) -> bytes:
    """
    Handle FEND (end of frame) encountered in a KISS data stream.

    :param buffer: the buffer containing the frame
    :param strip_df_start: remove leading null byte (DATA_FRAME opcode)
    :return: the bytes of the frame without escape characters or frame
        end markers (FEND)
    """
    frame = kissutil.recover_special_codes(kissutil.strip_nmea(bytes(buffer)))
    if strip_df_start:
        frame = kissutil.strip_df_start(frame)
    LOG.warning(f'handle_fend {" ".join(f"{b:02X}" for b in bytes(frame))}')
    return bytes(frame)


# class TCPKISSDriver(metaclass=trace.TraceWrapperMetaclass):
class TCPKISSDriver:
    """APRSD client driver for TCP KISS connections."""

    # Class level attributes required by Client protocol
    packets_received = 0
    packets_sent = 0
    last_packet_sent = None
    last_packet_received = None
    keepalive = None
    client_name = None
    socket = None
    # timeout in seconds
    select_timeout = 1
    path = None

    def __init__(self):
        """Initialize the KISS client.

        Args:
            client_name: Name of the client instance
        """
        super().__init__()
        self._connected = False
        self.keepalive = datetime.datetime.now()
        self._running = False
        # This is initialized in setup_connection()
        self.socket = None

    @property
    def transport(self) -> str:
        return client.TRANSPORT_TCPKISS

    @classmethod
    def is_enabled(cls) -> bool:
        """Check if KISS is enabled in configuration.

        Returns:
            bool: True if either TCP is enabled
        """
        return CONF.kiss_tcp.enabled

    @staticmethod
    def is_configured():
        # Ensure that the config vars are correctly set
        if TCPKISSDriver.is_enabled():
            if not CONF.kiss_tcp.host:
                LOG.error('KISS TCP enabled, but no host is set.')
                raise exception.MissingConfigOptionException(
                    'kiss_tcp.host is not set.',
                )
            return True
        return False

    @property
    def is_alive(self) -> bool:
        """Check if the client is connected.

        Returns:
            bool: True if connected to KISS TNC, False otherwise
        """
        return self._connected

    def close(self):
        """Close the connection."""
        self.stop()

    def send(self, packet: core.Packet):
        """Send an APRS packet.

        Args:
            packet: APRS packet to send (Packet or Message object)

        Raises:
            Exception: If not connected or send fails
        """
        if not self.socket:
            raise Exception('KISS interface not initialized')

        payload = None
        path = self.path
        packet.prepare()
        payload = packet.payload.encode('US-ASCII')
        if packet.path:
            path = packet.path

        LOG.debug(
            f"KISS Send '{payload}' TO '{packet.to_call}' From "
            f"'{packet.from_call}' with PATH '{path}'",
        )
        frame = ax25frame.Frame.ui(
            destination='APZ100',
            # destination=packet.to_call,
            source=packet.from_call,
            path=path,
            info=payload,
        )

        # now escape the frame special characters
        frame_escaped = kissutil.escape_special_codes(bytes(frame))
        # and finally wrap the frame in KISS protocol
        command = Command.DATA_FRAME
        frame_kiss = b''.join(
            [kiss_constants.FEND, command.value, frame_escaped, kiss_constants.FEND]
        )
        self.socket.send(frame_kiss)
        # Update last packet sent time
        self.last_packet_sent = datetime.datetime.now()
        # Increment packets sent counter
        self.packets_sent += 1

    def setup_connection(self):
        """Set up the KISS interface."""
        if not self.is_enabled():
            LOG.error('KISS is not enabled in configuration')
            return

        try:
            # Configure for TCP KISS
            if self.is_enabled():
                LOG.info(
                    f'KISS TCP Connection to {CONF.kiss_tcp.host}:{CONF.kiss_tcp.port}'
                )
                self.path = CONF.kiss_tcp.path
                self.connect()
                if self._connected:
                    LOG.info('KISS interface initialized')
                else:
                    LOG.error('Failed to connect to KISS interface')

        except Exception as ex:
            LOG.error('Failed to initialize KISS interface')
            LOG.exception(ex)
            self._connected = False

    def set_filter(self, filter_text: str):
        """Set packet filter (not implemented for KISS).

        Args:
            filter_text: Filter specification (ignored for KISS)
        """
        # KISS doesn't support filtering at the TNC level
        pass

    @property
    def filter(self) -> str:
        """Get packet filter (not implemented for KISS).
        Returns:
            str: Empty string (not implemented for KISS)
        """
        return ''

    def login_success(self) -> bool:
        """There is no login for KISS."""
        if not self._connected:
            return False
        return True

    def login_failure(self) -> str:
        """There is no login for KISS."""
        return 'Login successful'

    def consumer(self, callback: Callable, raw: bool = False):
        """Start consuming frames with the given callback.

        Args:
            callback: Function to call with received packets

        Raises:
            Exception: If not connected to KISS TNC
        """
        self._running = True
        while self._running:
            # Ensure connection
            if not self._connected:
                if not self.connect():
                    time.sleep(1)
                    continue

            # Read frame
            frame = self.read_frame()
            if frame:
                LOG.warning(f'GOT FRAME: {frame} calling {callback}')
                kwargs = {
                    'frame': frame,
                }
                callback(**kwargs)

    def decode_packet(self, *args, **kwargs) -> core.Packet:
        """Decode a packet from an AX.25 frame.

        Args:
            frame: Received AX.25 frame
        """
        frame = kwargs.get('frame')
        if not frame:
            LOG.warning('No frame received to decode?!?!')
            return None

        LOG.warning(f'FRAME: {str(frame)}')
        try:
            aprslib_frame = aprslib.parse(str(frame))
            return core.factory(aprslib_frame)
        except Exception as e:
            LOG.error(f'Error decoding packet: {e}')
            return None

    def stop(self):
        """Stop the KISS interface."""
        self._running = False
        self._connected = False
        if self.socket:
            try:
                self.socket.close()
            except Exception:
                pass

    def stats(self, serializable: bool = False) -> Dict[str, Any]:
        """Get client statistics.

        Returns:
            Dict containing client statistics
        """
        if serializable:
            keepalive = self.keepalive.isoformat()
        else:
            keepalive = self.keepalive
        stats = {
            'client': self.__class__.__name__,
            'transport': self.transport,
            'connected': self._connected,
            'path': self.path,
            'packets_sent': self.packets_sent,
            'packets_received': self.packets_received,
            'last_packet_sent': self.last_packet_sent,
            'last_packet_received': self.last_packet_received,
            'connection_keepalive': keepalive,
            'host': CONF.kiss_tcp.host,
            'port': CONF.kiss_tcp.port,
        }

        return stats

    def connect(self) -> bool:
        """Establish TCP connection to the KISS host.

        Returns:
            bool: True if connection successful, False otherwise
        """
        try:
            if self.socket:
                try:
                    self.socket.close()
                except Exception:
                    pass

            self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            self.socket.settimeout(5.0)  # 5 second timeout for connection
            self.socket.connect((CONF.kiss_tcp.host, CONF.kiss_tcp.port))
            self.socket.settimeout(0.1)  # Reset to shorter timeout for reads
            self._connected = True
            self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
            # MACOS doesn't have TCP_KEEPIDLE
            if hasattr(socket, 'TCP_KEEPIDLE'):
                self.socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 1)
            self.socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 3)
            self.socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)
            return True

        except ConnectionError as e:
            LOG.error(
                f'Failed to connect to {CONF.kiss_tcp.host}:{CONF.kiss_tcp.port} - {str(e)}'
            )
            self._connected = False
            return False

        except Exception as e:
            LOG.error(
                f'Failed to connect to {CONF.kiss_tcp.host}:{CONF.kiss_tcp.port} - {str(e)}'
            )
            self._connected = False
            return False

    def fix_raw_frame(self, raw_frame: bytes) -> bytes:
        """Fix the raw frame by recalculating the FCS."""
        ax25_data = raw_frame[2:-1]  # Remove KISS markers
        return handle_fend(ax25_data)

    def read_frame(self, blocking=False):
        """
        Generator for complete lines, received from the server
        """
        try:
            self.socket.setblocking(0)
        except OSError as e:
            LOG.error(f'socket error when setblocking(0): {str(e)}')
            raise aprslib.ConnectionDrop('connection dropped') from e

        while self._running:
            short_buf = b''

            try:
                readable, _, _ = select.select(
                    [self.socket],
                    [],
                    [],
                    self.select_timeout,
                )
                if not readable:
                    if not blocking:
                        break
                    else:
                        continue
            except Exception as e:
                LOG.error(f'Error in read loop: {e}')
                self._connected = False
                break

            try:
                print('reading from socket')
                short_buf = self.socket.recv(1024)
                print(f'short_buf: {short_buf}')
                # sock.recv returns empty if the connection drops
                if not short_buf:
                    if not blocking:
                        # We could just not be blocking, so empty is expected
                        continue
                    else:
                        self.logger.error('socket.recv(): returned empty')
                        raise aprslib.ConnectionDrop('connection dropped')

                raw_frame = self.fix_raw_frame(short_buf)
                return ax25frame.Frame.from_bytes(raw_frame)
            except OSError as e:
                # self.logger.error("socket error on recv(): %s" % str(e))
                if 'Resource temporarily unavailable' in str(e):
                    if not blocking:
                        if len(short_buf) == 0:
                            break
            except socket.timeout:
                continue
            except (KeyboardInterrupt, SystemExit):
                raise
            except ConnectionError:
                self.close()
                if not self.auto_reconnect:
                    raise
                else:
                    self.connect()
                    continue
            except StopIteration:
                break
            except IOError:
                LOG.error('IOError')
                break
            except Exception as e:
                LOG.error(f'Error in read loop: {e}')
                self._connected = False
                if not self.auto_reconnect:
                    break
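
The KISS framing done inline in send() above reduces to: escape the AX.25 bytes, then bracket them with FEND markers and a DATA_FRAME opcode. A small helper sketch that mirrors that exact sequence, using only the kiss-library helpers already imported in the listing (the helper name is ours, not the repository's):

    from kiss import constants as kiss_constants
    from kiss import util as kissutil
    from kiss.kiss import Command


    def kiss_wrap(ax25_bytes: bytes) -> bytes:
        # Same steps as TCPKISSDriver.send(): escape, then FEND + DATA_FRAME + payload + FEND.
        escaped = kissutil.escape_special_codes(ax25_bytes)
        return b''.join(
            [kiss_constants.FEND, Command.DATA_FRAME.value, escaped, kiss_constants.FEND]
        )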
aprsd/client/factory.py  (new file, 91 lines)
@@ -0,0 +1,91 @@
import logging
from typing import Callable, Protocol, runtime_checkable

from aprsd import exception
from aprsd.packets import core

LOG = logging.getLogger("APRSD")


@runtime_checkable
class Client(Protocol):
    def __init__(self):
        pass

    def connect(self) -> bool:
        pass

    def disconnect(self) -> bool:
        pass

    def decode_packet(self, *args, **kwargs) -> type[core.Packet]:
        pass

    def is_enabled(self) -> bool:
        pass

    def is_configured(self) -> bool:
        pass

    def transport(self) -> str:
        pass

    def send(self, message: str) -> bool:
        pass

    def setup_connection(self) -> None:
        pass


class ClientFactory:
    _instance = None
    clients = []
    client = None

    def __new__(cls, *args, **kwargs):
        """This magic turns this into a singleton."""
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            # Put any initialization here.
        return cls._instance

    def __init__(self):
        self.clients: list[Callable] = []

    def register(self, aprsd_client: Callable):
        if isinstance(aprsd_client, Client):
            raise ValueError("Client must be a subclass of Client protocol")

        self.clients.append(aprsd_client)

    def create(self, key=None):
        for client in self.clients:
            if client.is_enabled():
                self.client = client()
                return self.client
        raise Exception("No client is configured!!")

    def client_exists(self):
        return bool(self.client)

    def is_client_enabled(self):
        """Make sure at least one client is enabled."""
        enabled = False
        for client in self.clients:
            if client.is_enabled():
                enabled = True
        return enabled

    def is_client_configured(self):
        enabled = False
        for client in self.clients:
            try:
                if client.is_configured():
                    enabled = True
            except exception.MissingConfigOptionException as ex:
                LOG.error(ex.message)
                return False
            except exception.ConfigOptionBogusDefaultException as ex:
                LOG.error(ex.message)
                return False
        return enabled
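
A minimal sketch of the call pattern this factory expects, based only on the methods listed above; KISSClient here stands in for any registered client class and the wiring is illustrative, not code from the repository:

    from aprsd.client.factory import ClientFactory
    from aprsd.client.kiss import KISSClient

    factory = ClientFactory()          # __new__() above makes this a process-wide singleton
    factory.register(KISSClient)       # classes are registered, not instances
    if factory.is_client_enabled():
        aprs_client = factory.create()  # instantiates the first client whose is_enabled() is True
        aprs_client.setup_connection()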
aprsd/client/fake.py  (new file, 49 lines)
@@ -0,0 +1,49 @@
import logging

from oslo_config import cfg

from aprsd import client
from aprsd.client import base
from aprsd.client.drivers import fake as fake_driver
from aprsd.utils import trace

CONF = cfg.CONF
LOG = logging.getLogger("APRSD")


class APRSDFakeClient(base.APRSClient, metaclass=trace.TraceWrapperMetaclass):
    def stats(self, serializable=False) -> dict:
        return {
            "transport": "Fake",
            "connected": True,
        }

    @staticmethod
    def is_enabled():
        if CONF.fake_client.enabled:
            return True
        return False

    @staticmethod
    def is_configured():
        return APRSDFakeClient.is_enabled()

    def is_alive(self):
        return True

    def close(self):
        pass

    def setup_connection(self):
        self.connected = True
        return fake_driver.APRSDFakeClient()

    @staticmethod
    def transport():
        return client.TRANSPORT_FAKE

    def decode_packet(self, *args, **kwargs):
        LOG.debug(f"kwargs {kwargs}")
        pkt = kwargs["packet"]
        LOG.debug(f"Got an APRS Fake Packet '{pkt}'")
        return pkt
aprsd/client/kiss.py  (new file, 142 lines)
@@ -0,0 +1,142 @@
import datetime
import logging

import aprslib
import timeago
from loguru import logger
from oslo_config import cfg

from aprsd import client, exception
from aprsd.client import base
from aprsd.client.drivers import kiss
from aprsd.packets import core

CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
LOGU = logger


class KISSClient(base.APRSClient):
    _client = None
    keepalive = datetime.datetime.now()

    def stats(self, serializable=False) -> dict:
        stats = {}
        if self.is_configured():
            keepalive = self.keepalive
            if serializable:
                keepalive = keepalive.isoformat()
            stats = {
                'connected': self.is_connected,
                'connection_keepalive': keepalive,
                'transport': self.transport(),
            }
            if self.transport() == client.TRANSPORT_TCPKISS:
                stats['host'] = CONF.kiss_tcp.host
                stats['port'] = CONF.kiss_tcp.port
            elif self.transport() == client.TRANSPORT_SERIALKISS:
                stats['device'] = CONF.kiss_serial.device
        return stats

    @staticmethod
    def is_enabled():
        """Return if tcp or serial KISS is enabled."""
        if CONF.kiss_serial.enabled:
            return True

        if CONF.kiss_tcp.enabled:
            return True

        return False

    @staticmethod
    def is_configured():
        # Ensure that the config vars are correctly set
        if KISSClient.is_enabled():
            transport = KISSClient.transport()
            if transport == client.TRANSPORT_SERIALKISS:
                if not CONF.kiss_serial.device:
                    LOG.error('KISS serial enabled, but no device is set.')
                    raise exception.MissingConfigOptionException(
                        'kiss_serial.device is not set.',
                    )
            elif transport == client.TRANSPORT_TCPKISS:
                if not CONF.kiss_tcp.host:
                    LOG.error('KISS TCP enabled, but no host is set.')
                    raise exception.MissingConfigOptionException(
                        'kiss_tcp.host is not set.',
                    )

            return True
        return False

    def is_alive(self):
        if self._client:
            return self._client.is_alive()
        else:
            return False

    def close(self):
        if self._client:
            self._client.stop()

    def keepalive_check(self):
        # Don't check the first time through.
        if not self.is_alive() and self._checks:
            LOG.warning("Resetting client. It's not alive.")
            self.reset()
        self._checks = True

    def keepalive_log(self):
        if ka := self._client.aprsd_keepalive:
            keepalive = timeago.format(ka)
        else:
            keepalive = 'N/A'
        LOGU.opt(colors=True).info(f'<green>Client keepalive {keepalive}</green>')

    @staticmethod
    def transport():
        if CONF.kiss_serial.enabled:
            return client.TRANSPORT_SERIALKISS

        if CONF.kiss_tcp.enabled:
            return client.TRANSPORT_TCPKISS

    def decode_packet(self, *args, **kwargs):
        """We get a frame, which has to be decoded."""
        LOG.debug(f'kwargs {kwargs}')
        frame = kwargs['frame']
        LOG.debug(f"Got an APRS Frame '{frame}'")
        # try and nuke the * from the fromcall sign.
        # frame.header._source._ch = False
        # payload = str(frame.payload.decode())
        # msg = f"{str(frame.header)}:{payload}"
        # msg = frame.tnc2
        # LOG.debug(f"Decoding {msg}")

        try:
            raw = aprslib.parse(str(frame))
            packet = core.factory(raw)
            if isinstance(packet, core.ThirdPartyPacket):
                return packet.subpacket
            else:
                return packet
        except Exception as ex:
            LOG.error(f'Error decoding packet: {ex}')

    def setup_connection(self):
        try:
            self._client = kiss.KISS3Client()
            self.connected = self.login_status['success'] = True
        except Exception as ex:
            self.connected = self.login_status['success'] = False
            self.login_status['message'] = str(ex)
        return self._client

    def consumer(self, callback, blocking=False, immortal=False, raw=False):
        try:
            self._client.consumer(callback)
            self.keepalive = datetime.datetime.now()
        except Exception as ex:
            LOG.error(f'Consumer failed {ex}')
            LOG.error(ex)
@@ -3,7 +3,7 @@ import threading
 import wrapt
 from oslo_config import cfg

-from aprsd.client.client import APRSDClient
+from aprsd import client
 from aprsd.utils import singleton

 CONF = cfg.CONF
@@ -15,4 +15,4 @@ class APRSClientStats:

     @wrapt.synchronized(lock)
     def stats(self, serializable=False):
-        return APRSDClient().stats(serializable=serializable)
+        return client.client_factory.create().stats(serializable=serializable)
@@ -11,6 +11,7 @@ from oslo_config import cfg
 from aprsd import cli_helper, conf, packets, plugin

 # local imports here
+from aprsd.client import base
 from aprsd.main import cli
 from aprsd.utils import trace

@@ -96,6 +97,8 @@ def test_plugin(
     if CONF.trace_enabled:
         trace.setup_tracing(['method', 'api'])

+    base.APRSClient()
+
     pm = plugin.PluginManager()
     if load_all:
         pm.setup_plugins(load_help_plugin=CONF.load_help_plugin)
@@ -17,13 +17,11 @@ from rich.console import Console
 # local imports here
 import aprsd
 from aprsd import cli_helper, packets, plugin, threads, utils
-from aprsd.client.client import APRSDClient
+from aprsd.client import client_factory
 from aprsd.main import cli
 from aprsd.packets import collector as packet_collector
-from aprsd.packets import core, seen_list
 from aprsd.packets import log as packet_log
-from aprsd.packets.filter import PacketFilter
-from aprsd.packets.filters import dupe_filter, packet_type
+from aprsd.packets import seen_list
 from aprsd.stats import collector
 from aprsd.threads import keepalive, rx
 from aprsd.threads import stats as stats_thread
@@ -31,7 +29,7 @@ from aprsd.threads.aprsd import APRSDThread

 # setup the global logger
 # log.basicConfig(level=log.DEBUG) # level=10
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")
 CONF = cfg.CONF
 LOGU = logger
 console = Console()
@@ -39,9 +37,9 @@ console = Console()

 def signal_handler(sig, frame):
     threads.APRSDThreadList().stop_all()
-    if 'subprocess' not in str(frame):
+    if "subprocess" not in str(frame):
         LOG.info(
-            'Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}'.format(
+            "Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}".format(
                 datetime.datetime.now(),
             ),
         )
@@ -50,66 +48,90 @@ def signal_handler(sig, frame):
     collector.Collector().collect()


-class APRSDListenProcessThread(rx.APRSDFilterThread):
+class APRSDListenThread(rx.APRSDRXThread):
     def __init__(
         self,
         packet_queue,
         packet_filter=None,
         plugin_manager=None,
-        enabled_plugins=None,
+        enabled_plugins=[],
         log_packets=False,
     ):
-        super().__init__('ListenProcThread', packet_queue)
+        super().__init__(packet_queue)
         self.packet_filter = packet_filter
         self.plugin_manager = plugin_manager
         if self.plugin_manager:
-            LOG.info(f'Plugins {self.plugin_manager.get_message_plugins()}')
+            LOG.info(f"Plugins {self.plugin_manager.get_message_plugins()}")
         self.log_packets = log_packets

-    def print_packet(self, packet):
-        if self.log_packets:
-            packet_log.log(packet)
-
-    def process_packet(self, packet: type[core.Packet]):
-        if self.plugin_manager:
-            # Don't do anything with the reply.
-            # This is the listen only command.
-            self.plugin_manager.run(packet)
+    def process_packet(self, *args, **kwargs):
+        packet = self._client.decode_packet(*args, **kwargs)
+        filters = {
+            packets.Packet.__name__: packets.Packet,
+            packets.AckPacket.__name__: packets.AckPacket,
+            packets.BeaconPacket.__name__: packets.BeaconPacket,
+            packets.GPSPacket.__name__: packets.GPSPacket,
+            packets.MessagePacket.__name__: packets.MessagePacket,
+            packets.MicEPacket.__name__: packets.MicEPacket,
+            packets.ObjectPacket.__name__: packets.ObjectPacket,
+            packets.StatusPacket.__name__: packets.StatusPacket,
+            packets.ThirdPartyPacket.__name__: packets.ThirdPartyPacket,
+            packets.WeatherPacket.__name__: packets.WeatherPacket,
+            packets.UnknownPacket.__name__: packets.UnknownPacket,
+        }
+
+        if self.packet_filter:
+            filter_class = filters[self.packet_filter]
+            if isinstance(packet, filter_class):
+                if self.log_packets:
+                    packet_log.log(packet)
+                if self.plugin_manager:
+                    # Don't do anything with the reply
+                    # This is the listen only command.
+                    self.plugin_manager.run(packet)
+        else:
+            if self.log_packets:
+                packet_log.log(packet)
+            if self.plugin_manager:
+                # Don't do anything with the reply.
+                # This is the listen only command.
+                self.plugin_manager.run(packet)
+
+        packet_collector.PacketCollector().rx(packet)


 class ListenStatsThread(APRSDThread):
     """Log the stats from the PacketList."""

     def __init__(self):
-        super().__init__('PacketStatsLog')
+        super().__init__("PacketStatsLog")
         self._last_total_rx = 0
-        self.period = 31

     def loop(self):
-        if self.loop_count % self.period == 0:
+        if self.loop_count % 10 == 0:
             # log the stats every 10 seconds
             stats_json = collector.Collector().collect()
-            stats = stats_json['PacketList']
-            total_rx = stats['rx']
-            packet_count = len(stats['packets'])
+            stats = stats_json["PacketList"]
+            total_rx = stats["rx"]
+            packet_count = len(stats["packets"])
             rx_delta = total_rx - self._last_total_rx
-            rate = rx_delta / self.period
+            rate = rx_delta / 10

             # Log summary stats
             LOGU.opt(colors=True).info(
-                f'<green>RX Rate: {rate:.2f} pps</green> '
-                f'<yellow>Total RX: {total_rx}</yellow> '
-                f'<red>RX Last {self.period} secs: {rx_delta}</red> '
-                f'<white>Packets in PacketListStats: {packet_count}</white>',
+                f"<green>RX Rate: {rate} pps</green> "
+                f"<yellow>Total RX: {total_rx}</yellow> "
+                f"<red>RX Last 10 secs: {rx_delta}</red> "
+                f"<white>Packets in PacketList: {packet_count}</white>",
             )
             self._last_total_rx = total_rx

             # Log individual type stats
-            for k, v in stats['types'].items():
-                thread_hex = f'fg {utils.hex_from_name(k)}'
+            for k, v in stats["types"].items():
+                thread_hex = f"fg {utils.hex_from_name(k)}"
                 LOGU.opt(colors=True).info(
-                    f'<{thread_hex}>{k:<15}</{thread_hex}> '
-                    f'<blue>RX: {v["rx"]}</blue> <red>TX: {v["tx"]}</red>',
+                    f"<{thread_hex}>{k:<15}</{thread_hex}> "
+                    f"<blue>RX: {v['rx']}</blue> <red>TX: {v['tx']}</red>",
                 )

             time.sleep(1)
@@ -119,19 +141,19 @@ class ListenStatsThread(APRSDThread):
 @cli.command()
 @cli_helper.add_options(cli_helper.common_options)
 @click.option(
-    '--aprs-login',
-    envvar='APRS_LOGIN',
+    "--aprs-login",
+    envvar="APRS_LOGIN",
     show_envvar=True,
-    help='What callsign to send the message from.',
+    help="What callsign to send the message from.",
 )
 @click.option(
-    '--aprs-password',
-    envvar='APRS_PASSWORD',
+    "--aprs-password",
+    envvar="APRS_PASSWORD",
     show_envvar=True,
-    help='the APRS-IS password for APRS_LOGIN',
+    help="the APRS-IS password for APRS_LOGIN",
 )
 @click.option(
-    '--packet-filter',
+    "--packet-filter",
     type=click.Choice(
         [
             packets.AckPacket.__name__,
@@ -148,37 +170,35 @@ class ListenStatsThread(APRSDThread):
         ],
         case_sensitive=False,
     ),
-    multiple=True,
-    default=[],
-    help='Filter by packet type',
+    help="Filter by packet type",
 )
 @click.option(
-    '--enable-plugin',
+    "--enable-plugin",
     multiple=True,
-    help='Enable a plugin. This is the name of the file in the plugins directory.',
+    help="Enable a plugin. This is the name of the file in the plugins directory.",
 )
 @click.option(
-    '--load-plugins',
+    "--load-plugins",
     default=False,
     is_flag=True,
-    help='Load plugins as enabled in aprsd.conf ?',
+    help="Load plugins as enabled in aprsd.conf ?",
 )
 @click.argument(
-    'filter',
+    "filter",
     nargs=-1,
     required=True,
 )
 @click.option(
-    '--log-packets',
+    "--log-packets",
     default=False,
     is_flag=True,
-    help='Log incoming packets.',
+    help="Log incoming packets.",
 )
 @click.option(
-    '--enable-packet-stats',
+    "--enable-packet-stats",
     default=False,
     is_flag=True,
-    help='Enable packet stats periodic logging.',
+    help="Enable packet stats periodic logging.",
 )
 @click.pass_context
 @cli_helper.process_standard_options
@@ -208,46 +228,46 @@ def listen(

     if not aprs_login:
         click.echo(ctx.get_help())
-        click.echo('')
-        ctx.fail('Must set --aprs-login or APRS_LOGIN')
+        click.echo("")
+        ctx.fail("Must set --aprs-login or APRS_LOGIN")
         ctx.exit()

     if not aprs_password:
         click.echo(ctx.get_help())
-        click.echo('')
-        ctx.fail('Must set --aprs-password or APRS_PASSWORD')
+        click.echo("")
+        ctx.fail("Must set --aprs-password or APRS_PASSWORD")
         ctx.exit()

     # CONF.aprs_network.login = aprs_login
     # config["aprs"]["password"] = aprs_password

-    LOG.info(f'APRSD Listen Started version: {aprsd.__version__}')
+    LOG.info(f"APRSD Listen Started version: {aprsd.__version__}")

     CONF.log_opt_values(LOG, logging.DEBUG)
     collector.Collector()

     # Try and load saved MsgTrack list
-    LOG.debug('Loading saved MsgTrack object.')
+    LOG.debug("Loading saved MsgTrack object.")

     # Initialize the client factory and create
     # The correct client object ready for use
     # Make sure we have 1 client transport enabled
-    if not APRSDClient().is_enabled:
-        LOG.error('No Clients are enabled in config.')
+    if not client_factory.is_client_enabled():
+        LOG.error("No Clients are enabled in config.")
         sys.exit(-1)

     # Creates the client object
-    LOG.info('Creating client connection')
-    aprs_client = APRSDClient()
+    LOG.info("Creating client connection")
+    aprs_client = client_factory.create()
     LOG.info(aprs_client)
     if not aprs_client.login_success:
         # We failed to login, will just quit!
-        msg = f'Login Failure: {aprs_client.login_failure}'
+        msg = f"Login Failure: {aprs_client.login_failure}"
         LOG.error(msg)
         print(msg)
         sys.exit(-1)

-    LOG.debug(f"Filter messages on aprsis server by '{filter}'")
+    LOG.debug(f"Filter by '{filter}'")
     aprs_client.set_filter(filter)

     keepalive_thread = keepalive.KeepAliveThread()
@@ -256,19 +276,10 @@ def listen(
     # just deregister the class from the packet collector
     packet_collector.PacketCollector().unregister(seen_list.SeenList)

-    # we don't want the dupe filter to run here.
-    PacketFilter().unregister(dupe_filter.DupePacketFilter)
-    if packet_filter:
-        LOG.info('Enabling packet filtering for {packet_filter}')
-        packet_type.PacketTypeFilter().set_allow_list(packet_filter)
-        PacketFilter().register(packet_type.PacketTypeFilter)
-    else:
-        LOG.info('No packet filtering enabled.')
-
     pm = None
     if load_plugins:
         pm = plugin.PluginManager()
-        LOG.info('Loading plugins')
+        LOG.info("Loading plugins")
         pm.setup_plugins(load_help_plugin=False)
     elif enable_plugin:
         pm = plugin.PluginManager()
@@ -279,37 +290,33 @@ def listen(
     else:
         LOG.warning(
             "Not Loading any plugins use --load-plugins to load what's "
-            'defined in the config file.',
+            "defined in the config file.",
         )

     if pm:
         for p in pm.get_plugins():
-            LOG.info('Loaded plugin %s', p.__class__.__name__)
+            LOG.info("Loaded plugin %s", p.__class__.__name__)

     stats = stats_thread.APRSDStatsStoreThread()
     stats.start()

-    LOG.debug('Start APRSDRxThread')
-    rx_thread = rx.APRSDRXThread(packet_queue=threads.packet_queue)
-    rx_thread.start()
-
-    LOG.debug('Create APRSDListenProcessThread')
-    listen_thread = APRSDListenProcessThread(
+    LOG.debug("Create APRSDListenThread")
+    listen_thread = APRSDListenThread(
         packet_queue=threads.packet_queue,
         packet_filter=packet_filter,
         plugin_manager=pm,
         enabled_plugins=enable_plugin,
         log_packets=log_packets,
     )
-    LOG.debug('Start APRSDListenProcessThread')
+    LOG.debug("Start APRSDListenThread")
     listen_thread.start()
     if enable_packet_stats:
         listen_stats = ListenStatsThread()
         listen_stats.start()

     keepalive_thread.start()
-    LOG.debug('keepalive Join')
+    LOG.debug("keepalive Join")
     keepalive_thread.join()
-    rx_thread.join()
+    LOG.debug("listen_thread Join")
     listen_thread.join()
     stats.join()
@@ -3,60 +3,58 @@ import sys
 import time

 import aprslib
-import click
 from aprslib.exceptions import LoginError
+import click
 from oslo_config import cfg

 import aprsd
-import aprsd.packets # noqa : F401
+from aprsd import cli_helper, packets
-from aprsd import (
+from aprsd import conf # noqa : F401
-cli_helper,
+from aprsd.client import client_factory
-conf, # noqa : F401
-packets,
-)
-from aprsd.client.client import APRSDClient
 from aprsd.main import cli
+import aprsd.packets # noqa : F401
 from aprsd.packets import collector
 from aprsd.packets import log as packet_log
 from aprsd.threads import tx


 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 @cli.command()
 @cli_helper.add_options(cli_helper.common_options)
 @click.option(
-'--aprs-login',
+"--aprs-login",
-envvar='APRS_LOGIN',
+envvar="APRS_LOGIN",
 show_envvar=True,
-help='What callsign to send the message from. Defaults to config entry.',
+help="What callsign to send the message from. Defaults to config entry.",
 )
 @click.option(
-'--aprs-password',
+"--aprs-password",
-envvar='APRS_PASSWORD',
+envvar="APRS_PASSWORD",
 show_envvar=True,
-help='the APRS-IS password for APRS_LOGIN. Defaults to config entry.',
+help="the APRS-IS password for APRS_LOGIN. Defaults to config entry.",
 )
 @click.option(
-'--no-ack',
+"--no-ack",
-'-n',
+"-n",
 is_flag=True,
 show_default=True,
 default=False,
 help="Don't wait for an ack, just sent it to APRS-IS and bail.",
 )
 @click.option(
-'--wait-response',
+"--wait-response",
-'-w',
+"-w",
 is_flag=True,
 show_default=True,
 default=False,
-help='Wait for a response to the message?',
+help="Wait for a response to the message?",
 )
-@click.option('--raw', default=None, help='Send a raw message. Implies --no-ack')
+@click.option("--raw", default=None, help="Send a raw message. Implies --no-ack")
-@click.argument('tocallsign', required=True)
+@click.argument("tocallsign", required=True)
-@click.argument('command', nargs=-1, required=True)
+@click.argument("command", nargs=-1, required=True)
 @click.pass_context
 @cli_helper.process_standard_options
 def send_message(
@@ -71,11 +69,11 @@ def send_message(
 ):
 """Send a message to a callsign via APRS_IS."""
 global got_ack, got_response
-quiet = ctx.obj['quiet']
+quiet = ctx.obj["quiet"]

 if not aprs_login:
 if CONF.aprs_network.login == conf.client.DEFAULT_LOGIN:
-click.echo('Must set --aprs_login or APRS_LOGIN')
+click.echo("Must set --aprs_login or APRS_LOGIN")
 ctx.exit(-1)
 return
 else:
@@ -83,15 +81,15 @@ def send_message(

 if not aprs_password:
 if not CONF.aprs_network.password:
-click.echo('Must set --aprs-password or APRS_PASSWORD')
+click.echo("Must set --aprs-password or APRS_PASSWORD")
 ctx.exit(-1)
 return
 else:
 aprs_password = CONF.aprs_network.password

-LOG.info(f'APRSD LISTEN Started version: {aprsd.__version__}')
+LOG.info(f"APRSD LISTEN Started version: {aprsd.__version__}")
 if type(command) is tuple:
-command = ' '.join(command)
+command = " ".join(command)
 if not quiet:
 if raw:
 LOG.info(f"L'{aprs_login}' R'{raw}'")
@@ -103,7 +101,7 @@ def send_message(

 def rx_packet(packet):
 global got_ack, got_response
-cl = APRSDClient()
+cl = client_factory.create()
 packet = cl.decode_packet(packet)
 collector.PacketCollector().rx(packet)
 packet_log.log(packet, tx=False)
@@ -131,7 +129,7 @@ def send_message(
 sys.exit(0)

 try:
-APRSDClient().client # noqa: B018
+client_factory.create().client
 except LoginError:
 sys.exit(-1)

@@ -142,7 +140,7 @@ def send_message(
 # message
 if raw:
 tx.send(
-packets.Packet(from_call='', to_call='', raw=raw),
+packets.Packet(from_call="", to_call="", raw=raw),
 direct=True,
 )
 sys.exit(0)
@@ -163,10 +161,10 @@ def send_message(
 # This will register a packet consumer with aprslib
 # When new packets come in the consumer will process
 # the packet
-aprs_client = APRSDClient()
+aprs_client = client_factory.create().client
 aprs_client.consumer(rx_packet, raw=False)
 except aprslib.exceptions.ConnectionDrop:
-LOG.error('Connection dropped, reconnecting')
+LOG.error("Connection dropped, reconnecting")
 time.sleep(5)
 # Force the deletion of the client object connected to aprs
 # This will cause a reconnect, next time client.get_client()
@@ -8,28 +8,63 @@ from oslo_config import cfg
 import aprsd
 from aprsd import cli_helper, plugin, threads, utils
 from aprsd import main as aprsd_main
-from aprsd.client.client import APRSDClient
+from aprsd.client import client_factory
 from aprsd.main import cli
 from aprsd.packets import collector as packet_collector
 from aprsd.packets import seen_list
-from aprsd.threads import keepalive, registry, rx, service, tx
+from aprsd.threads import aprsd as aprsd_threads
+from aprsd.threads import keepalive, registry, rx, tx
 from aprsd.threads import stats as stats_thread
+from aprsd.utils import singleton

 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


+@singleton
+class ServerThreads:
+"""Registry for threads that the server command runs.
+
+This enables extensions to register a thread to run during
+the server command.
+
+"""
+
+def __init__(self):
+self.threads: list[aprsd_threads.APRSDThread] = []
+
+def register(self, thread: aprsd_threads.APRSDThread):
+if not isinstance(thread, aprsd_threads.APRSDThread):
+raise TypeError(f"Thread {thread} is not an APRSDThread")
+self.threads.append(thread)
+
+def unregister(self, thread: aprsd_threads.APRSDThread):
+if not isinstance(thread, aprsd_threads.APRSDThread):
+raise TypeError(f"Thread {thread} is not an APRSDThread")
+self.threads.remove(thread)
+
+def start(self):
+"""Start all threads in the list."""
+for thread in self.threads:
+thread.start()
+
+def join(self):
+"""Join all the threads in the list"""
+for thread in self.threads:
+thread.join()


 # main() ###
 @cli.command()
 @cli_helper.add_options(cli_helper.common_options)
 @click.option(
-'-f',
+"-f",
-'--flush',
+"--flush",
-'flush',
+"flush",
 is_flag=True,
 show_default=True,
 default=False,
-help='Flush out all old aged messages on disk.',
+help="Flush out all old aged messages on disk.",
 )
 @click.pass_context
 @cli_helper.process_standard_options
@@ -38,31 +73,37 @@ def server(ctx, flush):
 signal.signal(signal.SIGINT, aprsd_main.signal_handler)
 signal.signal(signal.SIGTERM, aprsd_main.signal_handler)

-service_threads = service.ServiceThreads()
+server_threads = ServerThreads()

 level, msg = utils._check_version()
 if level:
 LOG.warning(msg)
 else:
 LOG.info(msg)
-LOG.info(f'APRSD Started version: {aprsd.__version__}')
+LOG.info(f"APRSD Started version: {aprsd.__version__}")

-# Make sure we have 1 client transport enabled
+# Initialize the client factory and create
-if not APRSDClient().is_enabled:
+# The correct client object ready for use
-LOG.error('No Clients are enabled in config.')
+if not client_factory.is_client_enabled():
+LOG.error("No Clients are enabled in config.")
 sys.exit(-1)

-if not APRSDClient().is_configured:
+# Make sure we have 1 client transport enabled
-LOG.error('APRS client is not properly configured in config file.')
+if not client_factory.is_client_enabled():
+LOG.error("No Clients are enabled in config.")
+sys.exit(-1)

+if not client_factory.is_client_configured():
+LOG.error("APRS client is not properly configured in config file.")
 sys.exit(-1)

 # Creates the client object
-LOG.info('Creating client connection')
+LOG.info("Creating client connection")
-aprs_client = APRSDClient()
+aprs_client = client_factory.create()
 LOG.info(aprs_client)
 if not aprs_client.login_success:
 # We failed to login, will just quit!
-msg = f'Login Failure: {aprs_client.login_failure}'
+msg = f"Login Failure: {aprs_client.login_failure}"
 LOG.error(msg)
 print(msg)
 sys.exit(-1)
@@ -73,7 +114,7 @@ def server(ctx, flush):
 # We register plugins first here so we can register each
 # plugins config options, so we can dump them all in the
 # log file output.
-LOG.info('Loading Plugin Manager and registering plugins')
+LOG.info("Loading Plugin Manager and registering plugins")
 plugin_manager = plugin.PluginManager()
 plugin_manager.setup_plugins(load_help_plugin=CONF.load_help_plugin)

@@ -81,10 +122,10 @@ def server(ctx, flush):
 CONF.log_opt_values(LOG, logging.DEBUG)
 message_plugins = plugin_manager.get_message_plugins()
 watchlist_plugins = plugin_manager.get_watchlist_plugins()
-LOG.info('Message Plugins enabled and running:')
+LOG.info("Message Plugins enabled and running:")
 for p in message_plugins:
 LOG.info(p)
-LOG.info('Watchlist Plugins enabled and running:')
+LOG.info("Watchlist Plugins enabled and running:")
 for p in watchlist_plugins:
 LOG.info(p)

@@ -94,37 +135,37 @@ def server(ctx, flush):

 # Now load the msgTrack from disk if any
 if flush:
-LOG.debug('Flushing All packet tracking objects.')
+LOG.debug("Flushing All packet tracking objects.")
 packet_collector.PacketCollector().flush()
 else:
 # Try and load saved MsgTrack list
-LOG.debug('Loading saved packet tracking data.')
+LOG.debug("Loading saved packet tracking data.")
 packet_collector.PacketCollector().load()

 # Now start all the main processing threads.

-service_threads.register(keepalive.KeepAliveThread())
+server_threads.register(keepalive.KeepAliveThread())
-service_threads.register(stats_thread.APRSDStatsStoreThread())
+server_threads.register(stats_thread.APRSDStatsStoreThread())
-service_threads.register(
+server_threads.register(
-rx.APRSDRXThread(
+rx.APRSDPluginRXThread(
 packet_queue=threads.packet_queue,
 ),
 )
-service_threads.register(
+server_threads.register(
 rx.APRSDPluginProcessPacketThread(
 packet_queue=threads.packet_queue,
 ),
 )

 if CONF.enable_beacon:
-LOG.info('Beacon Enabled. Starting Beacon thread.')
+LOG.info("Beacon Enabled. Starting Beacon thread.")
-service_threads.register(tx.BeaconSendThread())
+server_threads.register(tx.BeaconSendThread())

 if CONF.aprs_registry.enabled:
-LOG.info('Registry Enabled. Starting Registry thread.')
+LOG.info("Registry Enabled. Starting Registry thread.")
-service_threads.register(registry.APRSRegistryThread())
+server_threads.register(registry.APRSRegistryThread())

-service_threads.start()
+server_threads.start()
-service_threads.join()
+server_threads.join()

 return 0
@@ -3,219 +3,220 @@ from pathlib import Path
 from oslo_config import cfg

 home = str(Path.home())
-DEFAULT_CONFIG_DIR = f'{home}/.config/aprsd/'
+DEFAULT_CONFIG_DIR = f"{home}/.config/aprsd/"
-APRSD_DEFAULT_MAGIC_WORD = 'CHANGEME!!!'
+APRSD_DEFAULT_MAGIC_WORD = "CHANGEME!!!"

 watch_list_group = cfg.OptGroup(
-name='watch_list',
+name="watch_list",
-title='Watch List settings',
+title="Watch List settings",
 )

 registry_group = cfg.OptGroup(
-name='aprs_registry',
+name="aprs_registry",
-title='APRS Registry settings',
+title="APRS Registry settings",
 )

 aprsd_opts = [
 cfg.StrOpt(
-'callsign',
+"callsign",
 required=True,
-help='Callsign to use for messages sent by APRSD',
+help="Callsign to use for messages sent by APRSD",
 ),
 cfg.BoolOpt(
-'enable_save',
+"enable_save",
 default=True,
-help='Enable saving of watch list, packet tracker between restarts.',
+help="Enable saving of watch list, packet tracker between restarts.",
 ),
 cfg.StrOpt(
-'save_location',
+"save_location",
 default=DEFAULT_CONFIG_DIR,
-help='Save location for packet tracking files.',
+help="Save location for packet tracking files.",
 ),
 cfg.BoolOpt(
-'trace_enabled',
+"trace_enabled",
 default=False,
-help='Enable code tracing',
+help="Enable code tracing",
 ),
 cfg.StrOpt(
-'units',
+"units",
-default='imperial',
+default="imperial",
-help='Units for display, imperial or metric',
+help="Units for display, imperial or metric",
 ),
 cfg.IntOpt(
-'ack_rate_limit_period',
+"ack_rate_limit_period",
 default=1,
-help='The wait period in seconds per Ack packet being sent.'
+help="The wait period in seconds per Ack packet being sent."
-'1 means 1 ack packet per second allowed.'
+"1 means 1 ack packet per second allowed."
-'2 means 1 pack packet every 2 seconds allowed',
+"2 means 1 pack packet every 2 seconds allowed",
 ),
 cfg.IntOpt(
-'msg_rate_limit_period',
+"msg_rate_limit_period",
 default=2,
-help='Wait period in seconds per non AckPacket being sent.'
+help="Wait period in seconds per non AckPacket being sent."
-'2 means 1 packet every 2 seconds allowed.'
+"2 means 1 packet every 2 seconds allowed."
-'5 means 1 pack packet every 5 seconds allowed',
+"5 means 1 pack packet every 5 seconds allowed",
 ),
 cfg.IntOpt(
-'packet_dupe_timeout',
+"packet_dupe_timeout",
 default=300,
-help='The number of seconds before a packet is not considered a duplicate.',
+help="The number of seconds before a packet is not considered a duplicate.",
 ),
 cfg.BoolOpt(
-'enable_beacon',
+"enable_beacon",
 default=False,
-help='Enable sending of a GPS Beacon packet to locate this service. '
+help="Enable sending of a GPS Beacon packet to locate this service. "
-'Requires latitude and longitude to be set.',
+"Requires latitude and longitude to be set.",
 ),
 cfg.IntOpt(
-'beacon_interval',
+"beacon_interval",
 default=1800,
-help='The number of seconds between beacon packets.',
+help="The number of seconds between beacon packets.",
 ),
 cfg.StrOpt(
-'beacon_symbol',
+"beacon_symbol",
-default='/',
+default="/",
-help='The symbol to use for the GPS Beacon packet. See: http://www.aprs.net/vm/DOS/SYMBOLS.HTM',
+help="The symbol to use for the GPS Beacon packet. See: http://www.aprs.net/vm/DOS/SYMBOLS.HTM",
 ),
 cfg.StrOpt(
-'latitude',
+"latitude",
 default=None,
-help='Latitude for the GPS Beacon button. If not set, the button will not be enabled.',
+help="Latitude for the GPS Beacon button. If not set, the button will not be enabled.",
 ),
 cfg.StrOpt(
-'longitude',
+"longitude",
 default=None,
-help='Longitude for the GPS Beacon button. If not set, the button will not be enabled.',
+help="Longitude for the GPS Beacon button. If not set, the button will not be enabled.",
 ),
 cfg.StrOpt(
-'log_packet_format',
+"log_packet_format",
-choices=['compact', 'multiline', 'both'],
+choices=["compact", "multiline", "both"],
-default='compact',
+default="compact",
 help="When logging packets 'compact' will use a single line formatted for each packet."
 "'multiline' will use multiple lines for each packet and is the traditional format."
-'both will log both compact and multiline.',
+"both will log both compact and multiline.",
 ),
 cfg.IntOpt(
-'default_packet_send_count',
+"default_packet_send_count",
 default=3,
-help='The number of times to send a non ack packet before giving up.',
+help="The number of times to send a non ack packet before giving up.",
 ),
 cfg.IntOpt(
-'default_ack_send_count',
+"default_ack_send_count",
 default=3,
-help='The number of times to send an ack packet in response to recieving a packet.',
+help="The number of times to send an ack packet in response to recieving a packet.",
 ),
 cfg.IntOpt(
-'packet_list_maxlen',
+"packet_list_maxlen",
 default=100,
-help='The maximum number of packets to store in the packet list.',
+help="The maximum number of packets to store in the packet list.",
 ),
 cfg.IntOpt(
-'packet_list_stats_maxlen',
+"packet_list_stats_maxlen",
 default=20,
-help='The maximum number of packets to send in the stats dict for admin ui. -1 means no max.',
+help="The maximum number of packets to send in the stats dict for admin ui.",
 ),
 cfg.BoolOpt(
-'enable_seen_list',
+"enable_seen_list",
 default=True,
-help='Enable the Callsign seen list tracking feature. This allows aprsd to keep track of '
+help="Enable the Callsign seen list tracking feature. This allows aprsd to keep track of "
-'callsigns that have been seen and when they were last seen.',
+"callsigns that have been seen and when they were last seen.",
 ),
 cfg.BoolOpt(
-'enable_packet_logging',
+"enable_packet_logging",
 default=True,
-help='Set this to False, to disable logging of packets to the log file.',
+help="Set this to False, to disable logging of packets to the log file.",
 ),
 cfg.BoolOpt(
-'load_help_plugin',
+"load_help_plugin",
 default=True,
-help='Set this to False to disable the help plugin.',
+help="Set this to False to disable the help plugin.",
 ),
 cfg.BoolOpt(
-'enable_sending_ack_packets',
+"enable_sending_ack_packets",
 default=True,
-help='Set this to False, to disable sending of ack packets. This will entirely stop'
+help="Set this to False, to disable sending of ack packets. This will entirely stop"
-'APRSD from sending ack packets.',
+"APRSD from sending ack packets.",
 ),
 ]

 watch_list_opts = [
 cfg.BoolOpt(
-'enabled',
+"enabled",
 default=False,
-help='Enable the watch list feature. Still have to enable '
+help="Enable the watch list feature. Still have to enable "
-'the correct plugin. Built-in plugin to use is '
+"the correct plugin. Built-in plugin to use is "
-'aprsd.plugins.notify.NotifyPlugin',
+"aprsd.plugins.notify.NotifyPlugin",
 ),
 cfg.ListOpt(
-'callsigns',
+"callsigns",
-help='Callsigns to watch for messsages',
+help="Callsigns to watch for messsages",
 ),
 cfg.StrOpt(
-'alert_callsign',
+"alert_callsign",
-help='The Ham Callsign to send messages to for watch list alerts.',
+help="The Ham Callsign to send messages to for watch list alerts.",
 ),
 cfg.IntOpt(
-'packet_keep_count',
+"packet_keep_count",
 default=10,
-help='The number of packets to store.',
+help="The number of packets to store.",
 ),
 cfg.IntOpt(
-'alert_time_seconds',
+"alert_time_seconds",
 default=3600,
-help='Time to wait before alert is sent on new message for users in callsigns.',
+help="Time to wait before alert is sent on new message for "
+"users in callsigns.",
 ),
 ]


 enabled_plugins_opts = [
 cfg.ListOpt(
-'enabled_plugins',
+"enabled_plugins",
 default=[
-'aprsd.plugins.fortune.FortunePlugin',
+"aprsd.plugins.fortune.FortunePlugin",
-'aprsd.plugins.location.LocationPlugin',
+"aprsd.plugins.location.LocationPlugin",
-'aprsd.plugins.ping.PingPlugin',
+"aprsd.plugins.ping.PingPlugin",
-'aprsd.plugins.time.TimePlugin',
+"aprsd.plugins.time.TimePlugin",
-'aprsd.plugins.weather.OWMWeatherPlugin',
+"aprsd.plugins.weather.OWMWeatherPlugin",
-'aprsd.plugins.version.VersionPlugin',
+"aprsd.plugins.version.VersionPlugin",
-'aprsd.plugins.notify.NotifySeenPlugin',
+"aprsd.plugins.notify.NotifySeenPlugin",
 ],
-help='Comma separated list of enabled plugins for APRSD.'
+help="Comma separated list of enabled plugins for APRSD."
-'To enable installed external plugins add them here.'
+"To enable installed external plugins add them here."
-'The full python path to the class name must be used',
+"The full python path to the class name must be used",
 ),
 ]

 registry_opts = [
 cfg.BoolOpt(
-'enabled',
+"enabled",
 default=False,
-help='Enable sending aprs registry information. This will let the '
+help="Enable sending aprs registry information. This will let the "
 "APRS registry know about your service and it's uptime. "
-'No personal information is sent, just the callsign, uptime and description. '
+"No personal information is sent, just the callsign, uptime and description. "
-'The service callsign is the callsign set in [DEFAULT] section.',
+"The service callsign is the callsign set in [DEFAULT] section.",
 ),
 cfg.StrOpt(
-'description',
+"description",
 default=None,
-help='Description of the service to send to the APRS registry. '
+help="Description of the service to send to the APRS registry. "
-'This is what will show up in the APRS registry.'
+"This is what will show up in the APRS registry."
-'If not set, the description will be the same as the callsign.',
+"If not set, the description will be the same as the callsign.",
 ),
 cfg.StrOpt(
-'registry_url',
+"registry_url",
-default='https://aprs.hemna.com/api/v1/registry',
+default="https://aprs.hemna.com/api/v1/registry",
-help='The APRS registry domain name to send the information to.',
+help="The APRS registry domain name to send the information to.",
 ),
 cfg.StrOpt(
-'service_website',
+"service_website",
 default=None,
-help='The website for your APRS service to send to the APRS registry.',
+help="The website for your APRS service to send to the APRS registry.",
 ),
 cfg.IntOpt(
-'frequency_seconds',
+"frequency_seconds",
 default=3600,
-help='The frequency in seconds to send the APRS registry information.',
+help="The frequency in seconds to send the APRS registry information.",
 ),
 ]

@@ -231,7 +232,7 @@ def register_opts(config):

 def list_opts():
 return {
-'DEFAULT': (aprsd_opts + enabled_plugins_opts),
+"DEFAULT": (aprsd_opts + enabled_plugins_opts),
 watch_list_group.name: watch_list_opts,
 registry_group.name: registry_opts,
 }
@@ -7,57 +7,47 @@ import logging
 from oslo_config import cfg

 LOG_LEVELS = {
-'CRITICAL': logging.CRITICAL,
+"CRITICAL": logging.CRITICAL,
-'ERROR': logging.ERROR,
+"ERROR": logging.ERROR,
-'WARNING': logging.WARNING,
+"WARNING": logging.WARNING,
-'INFO': logging.INFO,
+"INFO": logging.INFO,
-'DEBUG': logging.DEBUG,
+"DEBUG": logging.DEBUG,
 }

-DEFAULT_DATE_FORMAT = '%m/%d/%Y %I:%M:%S %p'
+DEFAULT_DATE_FORMAT = "%m/%d/%Y %I:%M:%S %p"
 DEFAULT_LOG_FORMAT = (
-'[%(asctime)s] [%(threadName)-20.20s] [%(levelname)-5.5s]'
+"[%(asctime)s] [%(threadName)-20.20s] [%(levelname)-5.5s]"
-' %(message)s - [%(pathname)s:%(lineno)d]'
+" %(message)s - [%(pathname)s:%(lineno)d]"
 )

 DEFAULT_LOG_FORMAT = (
-'<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | '
+"<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | "
-'<yellow>{thread.name: <18}</yellow> | '
+"<yellow>{thread.name: <18}</yellow> | "
-'<level>{level: <8}</level> | '
+"<level>{level: <8}</level> | "
-'<level>{message}</level> | '
+"<level>{message}</level> | "
-'<cyan>{name}</cyan>:<cyan>{function:}</cyan>:<magenta>{line:}</magenta>'
+"<cyan>{name}</cyan>:<cyan>{function:}</cyan>:<magenta>{line:}</magenta>"
 )

 logging_group = cfg.OptGroup(
-name='logging',
+name="logging",
-title='Logging options',
+title="Logging options",
 )
 logging_opts = [
 cfg.StrOpt(
-'logfile',
+"logfile",
 default=None,
-help='File to log to',
+help="File to log to",
 ),
 cfg.StrOpt(
-'logformat',
+"logformat",
 default=DEFAULT_LOG_FORMAT,
-help='Log file format, unless rich_logging enabled.',
+help="Log file format, unless rich_logging enabled.",
 ),
 cfg.StrOpt(
-'log_level',
+"log_level",
-default='INFO',
+default="INFO",
 choices=LOG_LEVELS.keys(),
-help='Log level for logging of events.',
+help="Log level for logging of events.",
-),
-cfg.BoolOpt(
-'enable_color',
-default=True,
-help='Enable ANSI color codes in logging',
-),
-cfg.BoolOpt(
-'enable_console_stdout',
-default=True,
-help='Enable logging to the console/stdout.',
 ),
 ]

@@ -51,7 +51,7 @@ class InterceptHandler(logging.Handler):
 # Setup the log faciility
 # to disable log to stdout, but still log to file
 # use the --quiet option on the cmdln
-def setup_logging(loglevel=None, quiet=False, custom_handler=None):
+def setup_logging(loglevel=None, quiet=False):
 if not loglevel:
 log_level = CONF.logging.log_level
 else:
@@ -63,53 +63,37 @@ def setup_logging(loglevel=None, quiet=False, custom_handler=None):

 # We don't really want to see the aprslib parsing debug output.
 disable_list = [
-'aprslib',
+"aprslib",
-'aprslib.parsing',
+"aprslib.parsing",
-'aprslib.exceptions',
+"aprslib.exceptions",
 ]

-chardet_list = [
-'chardet',
-'chardet.charsetprober',
-'chardet.eucjpprober',
-]

-for name in chardet_list:
-disable = logging.getLogger(name)
-disable.setLevel(logging.ERROR)

 # remove every other logger's handlers
 # and propagate to root logger
 for name in logging.root.manager.loggerDict.keys():
 logging.getLogger(name).handlers = []
 logging.getLogger(name).propagate = name not in disable_list

-handlers = []
+handlers = [
-if CONF.logging.enable_console_stdout and not quiet:
+{
-handlers.append(
+"sink": sys.stdout,
-{
+"serialize": False,
-'sink': sys.stdout,
+"format": CONF.logging.logformat,
-'serialize': False,
+"colorize": True,
-'format': CONF.logging.logformat,
+"level": log_level,
-'colorize': CONF.logging.enable_color,
+},
-'level': log_level,
+]
-},
-)

 if CONF.logging.logfile:
 handlers.append(
 {
-'sink': CONF.logging.logfile,
+"sink": CONF.logging.logfile,
-'serialize': False,
+"serialize": False,
-'format': CONF.logging.logformat,
+"format": CONF.logging.logformat,
-'colorize': False,
+"colorize": False,
-'level': log_level,
+"level": log_level,
 },
 )

-if custom_handler:
-handlers.append(custom_handler)

 # configure loguru
 logger.configure(handlers=handlers)
-logger.level('DEBUG', color='<fg #BABABA>')
+logger.level("DEBUG", color="<fg #BABABA>")
@@ -23,6 +23,7 @@
 import datetime
 import importlib.metadata as imp
 import logging
+import signal
 import sys
 import time
 from importlib.metadata import version as metadata_version
@@ -38,8 +39,9 @@ from aprsd.stats import collector
 # setup the global logger
 # log.basicConfig(level=log.DEBUG) # level=10
 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")
-CONTEXT_SETTINGS = dict(help_option_names=['-h', '--help'])
+CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"])
+flask_enabled = False


 @click.group(cls=cli_helper.AliasedGroup, context_settings=CONTEXT_SETTINGS)
@@ -66,16 +68,18 @@ def main():
 # First import all the possible commands for the CLI
 # The commands themselves live in the cmds directory
 load_commands()
-utils.load_entry_points('aprsd.extension')
+utils.load_entry_points("aprsd.extension")
-cli(auto_envvar_prefix='APRSD')
+cli(auto_envvar_prefix="APRSD")


 def signal_handler(sig, frame):
-click.echo('signal_handler: called')
+global flask_enabled

+click.echo("signal_handler: called")
 threads.APRSDThreadList().stop_all()
-if 'subprocess' not in str(frame):
+if "subprocess" not in str(frame):
 LOG.info(
-'Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}'.format(
+"Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}".format(
 datetime.datetime.now(),
 ),
 )
@@ -87,11 +91,14 @@ def signal_handler(sig, frame):
 packets.PacketList().save()
 collector.Collector().collect()
 except Exception as e:
-LOG.error(f'Failed to save data: {e}')
+LOG.error(f"Failed to save data: {e}")
 sys.exit(0)
 # signal.signal(signal.SIGTERM, sys.exit(0))
 # sys.exit(0)

+if flask_enabled:
+signal.signal(signal.SIGTERM, sys.exit(0))


 @cli.command()
 @cli_helper.add_options(cli_helper.common_options)
@@ -101,9 +108,9 @@ def check_version(ctx):
 """Check this version against the latest in pypi.org."""
 level, msg = utils._check_version()
 if level:
-click.secho(msg, fg='yellow')
+click.secho(msg, fg="yellow")
 else:
-click.secho(msg, fg='green')
+click.secho(msg, fg="green")


 @cli.command()
@@ -117,12 +124,12 @@ def sample_config(ctx):
 if sys.version_info < (3, 10):
 all = imp.entry_points()
 selected = []
-if 'oslo.config.opts' in all:
+if "oslo.config.opts" in all:
-for x in all['oslo.config.opts']:
+for x in all["oslo.config.opts"]:
-if x.group == 'oslo.config.opts':
+if x.group == "oslo.config.opts":
 selected.append(x)
 else:
-selected = imp.entry_points(group='oslo.config.opts')
+selected = imp.entry_points(group="oslo.config.opts")

 return selected

@@ -132,23 +139,23 @@ def sample_config(ctx):
 # selected = imp.entry_points(group="oslo.config.opts")
 selected = _get_selected_entry_points()
 for entry in selected:
-if 'aprsd' in entry.name:
+if "aprsd" in entry.name:
-args.append('--namespace')
+args.append("--namespace")
 args.append(entry.name)

 return args

 args = get_namespaces()
-config_version = metadata_version('oslo.config')
+config_version = metadata_version("oslo.config")
 logging.basicConfig(level=logging.WARN)
 conf = cfg.ConfigOpts()
 generator.register_cli_opts(conf)
 try:
 conf(args, version=config_version)
-except cfg.RequiredOptError as ex:
+except cfg.RequiredOptError:
 conf.print_help()
 if not sys.argv[1:]:
-raise SystemExit from ex
+raise SystemExit
 raise
 generator.generate(conf)
 return
@@ -158,9 +165,9 @@ def sample_config(ctx):
 @click.pass_context
 def version(ctx):
 """Show the APRSD version."""
-click.echo(click.style('APRSD Version : ', fg='white'), nl=False)
+click.echo(click.style("APRSD Version : ", fg="white"), nl=False)
-click.secho(f'{aprsd.__version__}', fg='yellow', bold=True)
+click.secho(f"{aprsd.__version__}", fg="yellow", bold=True)


-if __name__ == '__main__':
+if __name__ == "__main__":
 main()
@@ -15,8 +15,6 @@ from aprsd.packets.core import ( # noqa: F401
 WeatherPacket,
 factory,
 )
-from aprsd.packets.filter import PacketFilter
-from aprsd.packets.filters.dupe_filter import DupePacketFilter
 from aprsd.packets.packet_list import PacketList # noqa: F401
 from aprsd.packets.seen_list import SeenList # noqa: F401
 from aprsd.packets.tracker import PacketTrack # noqa: F401
@@ -28,9 +26,5 @@ collector.PacketCollector().register(SeenList)
 collector.PacketCollector().register(PacketTrack)
 collector.PacketCollector().register(WatchList)

-# Register all the packet filters for normal processing
-# For specific commands you can deregister these if you don't want them.
-PacketFilter().register(DupePacketFilter)


 NULL_MESSAGE = -1
@@ -19,26 +19,26 @@ from loguru import logger
 from aprsd.utils import counter

 # For mypy to be happy
-A = TypeVar('A', bound='DataClassJsonMixin')
+A = TypeVar("A", bound="DataClassJsonMixin")
 Json = Union[dict, list, str, int, float, bool, None]

 LOG = logging.getLogger()
 LOGU = logger

-PACKET_TYPE_BULLETIN = 'bulletin'
+PACKET_TYPE_BULLETIN = "bulletin"
-PACKET_TYPE_MESSAGE = 'message'
+PACKET_TYPE_MESSAGE = "message"
-PACKET_TYPE_ACK = 'ack'
+PACKET_TYPE_ACK = "ack"
-PACKET_TYPE_REJECT = 'reject'
+PACKET_TYPE_REJECT = "reject"
-PACKET_TYPE_MICE = 'mic-e'
+PACKET_TYPE_MICE = "mic-e"
-PACKET_TYPE_WX = 'wx'
+PACKET_TYPE_WX = "wx"
-PACKET_TYPE_WEATHER = 'weather'
+PACKET_TYPE_WEATHER = "weather"
-PACKET_TYPE_OBJECT = 'object'
+PACKET_TYPE_OBJECT = "object"
-PACKET_TYPE_UNKNOWN = 'unknown'
+PACKET_TYPE_UNKNOWN = "unknown"
-PACKET_TYPE_STATUS = 'status'
+PACKET_TYPE_STATUS = "status"
-PACKET_TYPE_BEACON = 'beacon'
+PACKET_TYPE_BEACON = "beacon"
-PACKET_TYPE_THIRDPARTY = 'thirdparty'
+PACKET_TYPE_THIRDPARTY = "thirdparty"
-PACKET_TYPE_TELEMETRY = 'telemetry-message'
+PACKET_TYPE_TELEMETRY = "telemetry-message"
-PACKET_TYPE_UNCOMPRESSED = 'uncompressed'
+PACKET_TYPE_UNCOMPRESSED = "uncompressed"

 NO_DATE = datetime(1900, 10, 24)

@@ -67,14 +67,14 @@ def _init_msgNo(): # noqa: N802

 def _translate_fields(raw: dict) -> dict:
 # Direct key checks instead of iteration
-if 'from' in raw:
+if "from" in raw:
-raw['from_call'] = raw.pop('from')
+raw["from_call"] = raw.pop("from")
-if 'to' in raw:
+if "to" in raw:
-raw['to_call'] = raw.pop('to')
+raw["to_call"] = raw.pop("to")

 # addresse overrides to_call
-if 'addresse' in raw:
+if "addresse" in raw:
-raw['to_call'] = raw['addresse']
+raw["to_call"] = raw["addresse"]

 return raw

@@ -82,7 +82,7 @@ def _translate_fields(raw: dict) -> dict:
 @dataclass_json
 @dataclass(unsafe_hash=True)
 class Packet:
-_type: str = field(default='Packet', hash=False)
+_type: str = field(default="Packet", hash=False)
 from_call: Optional[str] = field(default=None)
 to_call: Optional[str] = field(default=None)
 addresse: Optional[str] = field(default=None)
@@ -106,8 +106,6 @@ class Packet:
 last_send_time: float = field(repr=False, default=0, compare=False, hash=False)
 # Was the packet acked?
 acked: bool = field(repr=False, default=False, compare=False, hash=False)
-# Was the packet previously processed (for dupe checking)
-processed: bool = field(repr=False, default=False, compare=False, hash=False)

 # Do we allow this packet to be saved to send later?
 allow_delay: bool = field(repr=False, default=True, compare=False, hash=False)
@@ -120,7 +118,7 @@ class Packet:
 @property
 def key(self) -> str:
 """Build a key for finding this packet in a dict."""
-return f'{self.from_call}:{self.addresse}:{self.msgNo}'
+return f"{self.from_call}:{self.addresse}:{self.msgNo}"

 def update_timestamp(self) -> None:
 self.timestamp = _init_timestamp()
@@ -133,7 +131,7 @@ class Packet:
 the human readable payload.
 """
 self.prepare()
-msg = self._filter_for_send(self.raw).rstrip('\n')
+msg = self._filter_for_send(self.raw).rstrip("\n")
 return msg

 def prepare(self, create_msg_number=False) -> None:
@@ -152,11 +150,11 @@ class Packet:
 )

 # The base packet class has no real payload
-self.payload = f':{self.to_call.ljust(9)}'
+self.payload = f":{self.to_call.ljust(9)}"

 def _build_raw(self) -> None:
 """Build the self.raw which is what is sent over the air."""
-self.raw = '{}>APZ100:{}'.format(
+self.raw = "{}>APZ100:{}".format(
 self.from_call,
 self.payload,
 )
@@ -168,13 +166,13 @@ class Packet:
 # 67 displays 64 on the ftm400. (+3 {01 suffix)
 # feature req: break long ones into two msgs
 if not msg:
-return ''
+return ""

 message = msg[:67]
 # We all miss George Carlin
 return re.sub(
-'fuck|shit|cunt|piss|cock|bitch',
+"fuck|shit|cunt|piss|cock|bitch",
-'****',
+"****",
 message,
 flags=re.IGNORECASE,
 )
@@ -183,98 +181,101 @@ class Packet:
 """Show the raw version of the packet"""
 self.prepare()
 if not self.raw:
-raise ValueError('self.raw is unset')
+raise ValueError("self.raw is unset")
 return self.raw

 def __repr__(self) -> str:
 """Build the repr version of the packet."""
-return (
+repr = (
-f'{self.__class__.__name__}: From: {self.from_call} To: {self.to_call}'
+f"{self.__class__.__name__}:"
+f" From: {self.from_call} "
+f" To: {self.to_call}"
 )
+return repr


 @dataclass_json
 @dataclass(unsafe_hash=True)
 class AckPacket(Packet):
-_type: str = field(default='AckPacket', hash=False)
+_type: str = field(default="AckPacket", hash=False)

 def _build_payload(self):
-self.payload = f':{self.to_call: <9}:ack{self.msgNo}'
+self.payload = f":{self.to_call: <9}:ack{self.msgNo}"


 @dataclass_json
 @dataclass(unsafe_hash=True)
 class BulletinPacket(Packet):
-_type: str = 'BulletinPacket'
+_type: str = "BulletinPacket"
 # Holds the encapsulated packet
-bid: Optional[str] = field(default='1')
+bid: Optional[str] = field(default="1")
 message_text: Optional[str] = field(default=None)

 @property
 def key(self) -> str:
 """Build a key for finding this packet in a dict."""
-return f'{self.from_call}:BLN{self.bid}'
+return f"{self.from_call}:BLN{self.bid}"

 @property
 def human_info(self) -> str:
-return f'BLN{self.bid} {self.message_text}'
+return f"BLN{self.bid} {self.message_text}"

 def _build_payload(self) -> None:
-self.payload = f':BLN{self.bid:<9}:{self.message_text}'
+self.payload = f":BLN{self.bid:<9}" f":{self.message_text}"


 @dataclass_json
 @dataclass(unsafe_hash=True)
 class RejectPacket(Packet):
-_type: str = field(default='RejectPacket', hash=False)
+_type: str = field(default="RejectPacket", hash=False)
 response: Optional[str] = field(default=None)

 def __post__init__(self):
 if self.response:
-LOG.warning('Response set!')
+LOG.warning("Response set!")

 def _build_payload(self):
-self.payload = f':{self.to_call: <9}:rej{self.msgNo}'
+self.payload = f":{self.to_call: <9}:rej{self.msgNo}"


 @dataclass_json
 @dataclass(unsafe_hash=True)
 class MessagePacket(Packet):
-_type: str = field(default='MessagePacket', hash=False)
+_type: str = field(default="MessagePacket", hash=False)
 message_text: Optional[str] = field(default=None)

 @property
 def human_info(self) -> str:
 self.prepare()
-return self._filter_for_send(self.message_text).rstrip('\n')
+return self._filter_for_send(self.message_text).rstrip("\n")

 def _build_payload(self):
 if self.msgNo:
-self.payload = ':{}:{}{{{}'.format(
+self.payload = ":{}:{}{{{}".format(
 self.to_call.ljust(9),
-self._filter_for_send(self.message_text).rstrip('\n'),
+self._filter_for_send(self.message_text).rstrip("\n"),
 str(self.msgNo),
 )
 else:
-self.payload = ':{}:{}'.format(
+self.payload = ":{}:{}".format(
 self.to_call.ljust(9),
-self._filter_for_send(self.message_text).rstrip('\n'),
+self._filter_for_send(self.message_text).rstrip("\n"),
 )


 @dataclass_json
 @dataclass(unsafe_hash=True)
 class StatusPacket(Packet):
-_type: str = field(default='StatusPacket', hash=False)
+_type: str = field(default="StatusPacket", hash=False)
 status: Optional[str] = field(default=None)
 messagecapable: bool = field(default=False)
 comment: Optional[str] = field(default=None)
 raw_timestamp: Optional[str] = field(default=None)

 def _build_payload(self):
-self.payload = ':{}:{}{{{}'.format(
+self.payload = ":{}:{}{{{}".format(
 self.to_call.ljust(9),
-self._filter_for_send(self.status).rstrip('\n'),
+self._filter_for_send(self.status).rstrip("\n"),
 str(self.msgNo),
 )

@@ -287,7 +288,7 @@ class StatusPacket(Packet):
 @dataclass_json
 @dataclass(unsafe_hash=True)
 class GPSPacket(Packet):
-_type: str = field(default='GPSPacket', hash=False)
+_type: str = field(default="GPSPacket", hash=False)
 latitude: float = field(default=0.00)
 longitude: float = field(default=0.00)
 altitude: float = field(default=0.00)
@@ -295,8 +296,8 @@ class GPSPacket(Packet):
 posambiguity: int = field(default=0)
 messagecapable: bool = field(default=False)
 comment: Optional[str] = field(default=None)
-symbol: str = field(default='l')
+symbol: str = field(default="l")
-symbol_table: str = field(default='/')
+symbol_table: str = field(default="/")
 raw_timestamp: Optional[str] = field(default=None)
 object_name: Optional[str] = field(default=None)
 object_format: Optional[str] = field(default=None)
@@ -316,7 +317,7 @@ class GPSPacket(Packet):
 def _build_time_zulu(self):
 """Build the timestamp in UTC/zulu."""
 if self.timestamp:
-return datetime.utcfromtimestamp(self.timestamp).strftime('%d%H%M')
+return datetime.utcfromtimestamp(self.timestamp).strftime("%d%H%M")

 def _build_payload(self):
 """The payload is the non headers portion of the packet."""
@@ -324,7 +325,7 @@ class GPSPacket(Packet):
 lat = aprslib_util.latitude_to_ddm(self.latitude)
 long = aprslib_util.longitude_to_ddm(self.longitude)
 payload = [
-'@' if self.timestamp else '!',
+"@" if self.timestamp else "!",
 time_zulu,
 lat,
 self.symbol_table,
@@ -335,34 +336,34 @@ class GPSPacket(Packet):
 if self.comment:
 payload.append(self._filter_for_send(self.comment))

-self.payload = ''.join(payload)
+self.payload = "".join(payload)

 def _build_raw(self):
-self.raw = f'{self.from_call}>{self.to_call},WIDE2-1:{self.payload}'
+self.raw = f"{self.from_call}>{self.to_call},WIDE2-1:" f"{self.payload}"

 @property
 def human_info(self) -> str:
 h_str = []
-h_str.append(f'Lat:{self.latitude:03.3f}')
+h_str.append(f"Lat:{self.latitude:03.3f}")
-h_str.append(f'Lon:{self.longitude:03.3f}')
+h_str.append(f"Lon:{self.longitude:03.3f}")
 if self.altitude:
-h_str.append(f'Altitude {self.altitude:03.0f}')
+h_str.append(f"Altitude {self.altitude:03.0f}")
 if self.speed:
-h_str.append(f'Speed {self.speed:03.0f}MPH')
+h_str.append(f"Speed {self.speed:03.0f}MPH")
 if self.course:
-h_str.append(f'Course {self.course:03.0f}')
+h_str.append(f"Course {self.course:03.0f}")
 if self.rng:
-h_str.append(f'RNG {self.rng:03.0f}')
+h_str.append(f"RNG {self.rng:03.0f}")
if self.phg:
|
if self.phg:
|
||||||
h_str.append(f'PHG {self.phg}')
|
h_str.append(f"PHG {self.phg}")
|
||||||
|
|
||||||
return ' '.join(h_str)
|
return " ".join(h_str)
|
||||||
|
|
||||||
|
|
||||||
@dataclass_json
|
@dataclass_json
|
||||||
@dataclass(unsafe_hash=True)
|
@dataclass(unsafe_hash=True)
|
||||||
class BeaconPacket(GPSPacket):
|
class BeaconPacket(GPSPacket):
|
||||||
_type: str = field(default='BeaconPacket', hash=False)
|
_type: str = field(default="BeaconPacket", hash=False)
|
||||||
|
|
||||||
def _build_payload(self):
|
def _build_payload(self):
|
||||||
"""The payload is the non headers portion of the packet."""
|
"""The payload is the non headers portion of the packet."""
|
||||||
@ -370,42 +371,42 @@ class BeaconPacket(GPSPacket):
|
|||||||
lat = aprslib_util.latitude_to_ddm(self.latitude)
|
lat = aprslib_util.latitude_to_ddm(self.latitude)
|
||||||
lon = aprslib_util.longitude_to_ddm(self.longitude)
|
lon = aprslib_util.longitude_to_ddm(self.longitude)
|
||||||
|
|
||||||
self.payload = f'@{time_zulu}z{lat}{self.symbol_table}{lon}'
|
self.payload = f"@{time_zulu}z{lat}{self.symbol_table}" f"{lon}"
|
||||||
|
|
||||||
if self.comment:
|
if self.comment:
|
||||||
comment = self._filter_for_send(self.comment)
|
comment = self._filter_for_send(self.comment)
|
||||||
self.payload = f'{self.payload}{self.symbol}{comment}'
|
self.payload = f"{self.payload}{self.symbol}{comment}"
|
||||||
else:
|
else:
|
||||||
self.payload = f'{self.payload}{self.symbol}APRSD Beacon'
|
self.payload = f"{self.payload}{self.symbol}APRSD Beacon"
|
||||||
|
|
||||||
def _build_raw(self):
|
def _build_raw(self):
|
||||||
self.raw = f'{self.from_call}>APZ100:{self.payload}'
|
self.raw = f"{self.from_call}>APZ100:" f"{self.payload}"
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def key(self) -> str:
|
def key(self) -> str:
|
||||||
"""Build a key for finding this packet in a dict."""
|
"""Build a key for finding this packet in a dict."""
|
||||||
if self.raw_timestamp:
|
if self.raw_timestamp:
|
||||||
return f'{self.from_call}:{self.raw_timestamp}'
|
return f"{self.from_call}:{self.raw_timestamp}"
|
||||||
else:
|
else:
|
||||||
return f'{self.from_call}:{self.human_info.replace(" ", "")}'
|
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def human_info(self) -> str:
|
def human_info(self) -> str:
|
||||||
h_str = []
|
h_str = []
|
||||||
h_str.append(f'Lat:{self.latitude:03.3f}')
|
h_str.append(f"Lat:{self.latitude:03.3f}")
|
||||||
h_str.append(f'Lon:{self.longitude:03.3f}')
|
h_str.append(f"Lon:{self.longitude:03.3f}")
|
||||||
h_str.append(f'{self.comment}')
|
h_str.append(f"{self.comment}")
|
||||||
return ' '.join(h_str)
|
return " ".join(h_str)
|
||||||
|
|
||||||
|
|
||||||
@dataclass_json
|
@dataclass_json
|
||||||
@dataclass(unsafe_hash=True)
|
@dataclass(unsafe_hash=True)
|
||||||
class MicEPacket(GPSPacket):
|
class MicEPacket(GPSPacket):
|
||||||
_type: str = field(default='MicEPacket', hash=False)
|
_type: str = field(default="MicEPacket", hash=False)
|
||||||
messagecapable: bool = False
|
messagecapable: bool = False
|
||||||
mbits: Optional[str] = None
|
mbits: Optional[str] = None
|
||||||
mtype: Optional[str] = None
|
mtype: Optional[str] = None
|
||||||
telemetry: Optional[dict] = field(default=None, hash=False)
|
telemetry: Optional[dict] = field(default=None)
|
||||||
# in MPH
|
# in MPH
|
||||||
speed: float = 0.00
|
speed: float = 0.00
|
||||||
# 0 to 360
|
# 0 to 360
|
||||||
@ -414,24 +415,24 @@ class MicEPacket(GPSPacket):
|
|||||||
@property
|
@property
|
||||||
def key(self) -> str:
|
def key(self) -> str:
|
||||||
"""Build a key for finding this packet in a dict."""
|
"""Build a key for finding this packet in a dict."""
|
||||||
return f'{self.from_call}:{self.human_info.replace(" ", "")}'
|
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def human_info(self) -> str:
|
def human_info(self) -> str:
|
||||||
h_info = super().human_info
|
h_info = super().human_info
|
||||||
return f'{h_info} {self.mbits} mbits'
|
return f"{h_info} {self.mbits} mbits"
|
||||||
|
|
||||||
|
|
||||||
@dataclass_json
|
@dataclass_json
|
||||||
@dataclass(unsafe_hash=True)
|
@dataclass(unsafe_hash=True)
|
||||||
class TelemetryPacket(GPSPacket):
|
class TelemetryPacket(GPSPacket):
|
||||||
_type: str = field(default='TelemetryPacket', hash=False)
|
_type: str = field(default="TelemetryPacket", hash=False)
|
||||||
messagecapable: bool = False
|
messagecapable: bool = False
|
||||||
mbits: Optional[str] = None
|
mbits: Optional[str] = None
|
||||||
mtype: Optional[str] = None
|
mtype: Optional[str] = None
|
||||||
telemetry: Optional[dict] = field(default=None)
|
telemetry: Optional[dict] = field(default=None)
|
||||||
tPARM: Optional[list[str]] = field(default=None, hash=False) # noqa: N815
|
tPARM: Optional[list[str]] = field(default=None) # noqa: N815
|
||||||
tUNIT: Optional[list[str]] = field(default=None, hash=False) # noqa: N815
|
tUNIT: Optional[list[str]] = field(default=None) # noqa: N815
|
||||||
# in MPH
|
# in MPH
|
||||||
speed: float = 0.00
|
speed: float = 0.00
|
||||||
# 0 to 360
|
# 0 to 360
|
||||||
@ -441,23 +442,23 @@ class TelemetryPacket(GPSPacket):
|
|||||||
def key(self) -> str:
|
def key(self) -> str:
|
||||||
"""Build a key for finding this packet in a dict."""
|
"""Build a key for finding this packet in a dict."""
|
||||||
if self.raw_timestamp:
|
if self.raw_timestamp:
|
||||||
return f'{self.from_call}:{self.raw_timestamp}'
|
return f"{self.from_call}:{self.raw_timestamp}"
|
||||||
else:
|
else:
|
||||||
return f'{self.from_call}:{self.human_info.replace(" ", "")}'
|
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def human_info(self) -> str:
|
def human_info(self) -> str:
|
||||||
h_info = super().human_info
|
h_info = super().human_info
|
||||||
return f'{h_info} {self.telemetry}'
|
return f"{h_info} {self.telemetry}"
|
||||||
|
|
||||||
|
|
||||||
@dataclass_json
|
@dataclass_json
|
||||||
@dataclass(unsafe_hash=True)
|
@dataclass(unsafe_hash=True)
|
||||||
class ObjectPacket(GPSPacket):
|
class ObjectPacket(GPSPacket):
|
||||||
_type: str = field(default='ObjectPacket', hash=False)
|
_type: str = field(default="ObjectPacket", hash=False)
|
||||||
alive: bool = True
|
alive: bool = True
|
||||||
raw_timestamp: Optional[str] = None
|
raw_timestamp: Optional[str] = None
|
||||||
symbol: str = field(default='r')
|
symbol: str = field(default="r")
|
||||||
# in MPH
|
# in MPH
|
||||||
speed: float = 0.00
|
speed: float = 0.00
|
||||||
# 0 to 360
|
# 0 to 360
|
||||||
@ -468,11 +469,11 @@ class ObjectPacket(GPSPacket):
|
|||||||
lat = aprslib_util.latitude_to_ddm(self.latitude)
|
lat = aprslib_util.latitude_to_ddm(self.latitude)
|
||||||
long = aprslib_util.longitude_to_ddm(self.longitude)
|
long = aprslib_util.longitude_to_ddm(self.longitude)
|
||||||
|
|
||||||
self.payload = f'*{time_zulu}z{lat}{self.symbol_table}{long}{self.symbol}'
|
self.payload = f"*{time_zulu}z{lat}{self.symbol_table}" f"{long}{self.symbol}"
|
||||||
|
|
||||||
if self.comment:
|
if self.comment:
|
||||||
comment = self._filter_for_send(self.comment)
|
comment = self._filter_for_send(self.comment)
|
||||||
self.payload = f'{self.payload}{comment}'
|
self.payload = f"{self.payload}{comment}"
|
||||||
|
|
||||||
def _build_raw(self):
|
def _build_raw(self):
|
||||||
"""
|
"""
|
||||||
@ -485,18 +486,18 @@ class ObjectPacket(GPSPacket):
|
|||||||
The frequency, uplink_tone, offset is part of the comment
|
The frequency, uplink_tone, offset is part of the comment
|
||||||
"""
|
"""
|
||||||
|
|
||||||
self.raw = f'{self.from_call}>APZ100:;{self.to_call:9s}{self.payload}'
|
self.raw = f"{self.from_call}>APZ100:;{self.to_call:9s}" f"{self.payload}"
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def human_info(self) -> str:
|
def human_info(self) -> str:
|
||||||
h_info = super().human_info
|
h_info = super().human_info
|
||||||
return f'{h_info} {self.comment}'
|
return f"{h_info} {self.comment}"
|
||||||
|
|
||||||
|
|
||||||
@dataclass(unsafe_hash=True)
|
@dataclass(unsafe_hash=True)
|
||||||
class WeatherPacket(GPSPacket, DataClassJsonMixin):
|
class WeatherPacket(GPSPacket, DataClassJsonMixin):
|
||||||
_type: str = field(default='WeatherPacket', hash=False)
|
_type: str = field(default="WeatherPacket", hash=False)
|
||||||
symbol: str = '_'
|
symbol: str = "_"
|
||||||
wind_speed: float = 0.00
|
wind_speed: float = 0.00
|
||||||
wind_direction: int = 0
|
wind_direction: int = 0
|
||||||
wind_gust: float = 0.00
|
wind_gust: float = 0.00
|
||||||
@ -514,8 +515,8 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
|
|||||||
speed: Optional[float] = field(default=None)
|
speed: Optional[float] = field(default=None)
|
||||||
|
|
||||||
def _translate(self, raw: dict) -> dict:
|
def _translate(self, raw: dict) -> dict:
|
||||||
for key in raw['weather']:
|
for key in raw["weather"]:
|
||||||
raw[key] = raw['weather'][key]
|
raw[key] = raw["weather"][key]
|
||||||
|
|
||||||
# If we have the broken aprslib, then we need to
|
# If we have the broken aprslib, then we need to
|
||||||
# Convert the course and speed to wind_speed and wind_direction
|
# Convert the course and speed to wind_speed and wind_direction
|
||||||
@ -523,36 +524,36 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
|
|||||||
# https://github.com/rossengeorgiev/aprs-python/issues/80
|
# https://github.com/rossengeorgiev/aprs-python/issues/80
|
||||||
# Wind speed and course is option in the SPEC.
|
# Wind speed and course is option in the SPEC.
|
||||||
# For some reason aprslib multiplies the speed by 1.852.
|
# For some reason aprslib multiplies the speed by 1.852.
|
||||||
if 'wind_speed' not in raw and 'wind_direction' not in raw:
|
if "wind_speed" not in raw and "wind_direction" not in raw:
|
||||||
# Most likely this is the broken aprslib
|
# Most likely this is the broken aprslib
|
||||||
# So we need to convert the wind_gust speed
|
# So we need to convert the wind_gust speed
|
||||||
raw['wind_gust'] = round(raw.get('wind_gust', 0) / 0.44704, 3)
|
raw["wind_gust"] = round(raw.get("wind_gust", 0) / 0.44704, 3)
|
||||||
if 'wind_speed' not in raw:
|
if "wind_speed" not in raw:
|
||||||
wind_speed = raw.get('speed')
|
wind_speed = raw.get("speed")
|
||||||
if wind_speed:
|
if wind_speed:
|
||||||
raw['wind_speed'] = round(wind_speed / 1.852, 3)
|
raw["wind_speed"] = round(wind_speed / 1.852, 3)
|
||||||
raw['weather']['wind_speed'] = raw['wind_speed']
|
raw["weather"]["wind_speed"] = raw["wind_speed"]
|
||||||
if 'speed' in raw:
|
if "speed" in raw:
|
||||||
del raw['speed']
|
del raw["speed"]
|
||||||
# Let's adjust the rain numbers as well, since it's wrong
|
# Let's adjust the rain numbers as well, since it's wrong
|
||||||
raw['rain_1h'] = round((raw.get('rain_1h', 0) / 0.254) * 0.01, 3)
|
raw["rain_1h"] = round((raw.get("rain_1h", 0) / 0.254) * 0.01, 3)
|
||||||
raw['weather']['rain_1h'] = raw['rain_1h']
|
raw["weather"]["rain_1h"] = raw["rain_1h"]
|
||||||
raw['rain_24h'] = round((raw.get('rain_24h', 0) / 0.254) * 0.01, 3)
|
raw["rain_24h"] = round((raw.get("rain_24h", 0) / 0.254) * 0.01, 3)
|
||||||
raw['weather']['rain_24h'] = raw['rain_24h']
|
raw["weather"]["rain_24h"] = raw["rain_24h"]
|
||||||
raw['rain_since_midnight'] = round(
|
raw["rain_since_midnight"] = round(
|
||||||
(raw.get('rain_since_midnight', 0) / 0.254) * 0.01, 3
|
(raw.get("rain_since_midnight", 0) / 0.254) * 0.01, 3
|
||||||
)
|
)
|
||||||
raw['weather']['rain_since_midnight'] = raw['rain_since_midnight']
|
raw["weather"]["rain_since_midnight"] = raw["rain_since_midnight"]
|
||||||
|
|
||||||
if 'wind_direction' not in raw:
|
if "wind_direction" not in raw:
|
||||||
wind_direction = raw.get('course')
|
wind_direction = raw.get("course")
|
||||||
if wind_direction:
|
if wind_direction:
|
||||||
raw['wind_direction'] = wind_direction
|
raw["wind_direction"] = wind_direction
|
||||||
raw['weather']['wind_direction'] = raw['wind_direction']
|
raw["weather"]["wind_direction"] = raw["wind_direction"]
|
||||||
if 'course' in raw:
|
if "course" in raw:
|
||||||
del raw['course']
|
del raw["course"]
|
||||||
|
|
||||||
del raw['weather']
|
del raw["weather"]
|
||||||
return raw
|
return raw
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
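
The `_translate` hunk above undoes aprslib's weather scaling on both branches; only the quoting differs between master and 4.0.2. A minimal standalone sketch of the same arithmetic, using the constants exactly as they appear in the diff (1.852 for speed, 0.44704 for gusts, `/ 0.254 * 0.01` for rain) and assuming a plain parsed dict; it is not part of either branch:

```python
def undo_aprslib_weather_scaling(raw: dict) -> dict:
    """Sketch only: mirror the conversions shown in WeatherPacket._translate."""
    if 'speed' in raw:
        # aprslib multiplied the speed by 1.852, so divide it back out.
        raw['wind_speed'] = round(raw['speed'] / 1.852, 3)
    if 'wind_gust' in raw:
        raw['wind_gust'] = round(raw['wind_gust'] / 0.44704, 3)
    for key in ('rain_1h', 'rain_24h', 'rain_since_midnight'):
        if key in raw:
            # Same (value / 0.254) * 0.01 adjustment the diff applies.
            raw[key] = round((raw[key] / 0.254) * 0.01, 3)
    return raw
```
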
@@ -565,20 +566,20 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
     def key(self) -> str:
         """Build a key for finding this packet in a dict."""
         if self.raw_timestamp:
-            return f'{self.from_call}:{self.raw_timestamp}'
+            return f"{self.from_call}:{self.raw_timestamp}"
         elif self.wx_raw_timestamp:
-            return f'{self.from_call}:{self.wx_raw_timestamp}'
+            return f"{self.from_call}:{self.wx_raw_timestamp}"

     @property
     def human_info(self) -> str:
         h_str = []
-        h_str.append(f'Temp {self.temperature:03.0f}F')
+        h_str.append(f"Temp {self.temperature:03.0f}F")
-        h_str.append(f'Humidity {self.humidity}%')
+        h_str.append(f"Humidity {self.humidity}%")
-        h_str.append(f'Wind {self.wind_speed:03.0f}MPH@{self.wind_direction}')
+        h_str.append(f"Wind {self.wind_speed:03.0f}MPH@{self.wind_direction}")
-        h_str.append(f'Pressure {self.pressure}mb')
+        h_str.append(f"Pressure {self.pressure}mb")
-        h_str.append(f'Rain {self.rain_24h}in/24hr')
+        h_str.append(f"Rain {self.rain_24h}in/24hr")

-        return ' '.join(h_str)
+        return " ".join(h_str)

     def _build_payload(self):
         """Build an uncompressed weather packet
@@ -608,49 +609,49 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
         time_zulu = self._build_time_zulu()

         contents = [
-            f'@{time_zulu}z{self.latitude}{self.symbol_table}',
+            f"@{time_zulu}z{self.latitude}{self.symbol_table}",
-            f'{self.longitude}{self.symbol}',
+            f"{self.longitude}{self.symbol}",
-            f'{self.wind_direction:03d}',
+            f"{self.wind_direction:03d}",
             # Speed = sustained 1 minute wind speed in mph
-            f'{self.symbol_table}',
+            f"{self.symbol_table}",
-            f'{self.wind_speed:03.0f}',
+            f"{self.wind_speed:03.0f}",
             # wind gust (peak wind speed in mph in the last 5 minutes)
-            f'g{self.wind_gust:03.0f}',
+            f"g{self.wind_gust:03.0f}",
             # Temperature in degrees F
-            f't{self.temperature:03.0f}',
+            f"t{self.temperature:03.0f}",
             # Rainfall (in hundredths of an inch) in the last hour
-            f'r{self.rain_1h * 100:03.0f}',
+            f"r{self.rain_1h * 100:03.0f}",
             # Rainfall (in hundredths of an inch) in last 24 hours
-            f'p{self.rain_24h * 100:03.0f}',
+            f"p{self.rain_24h * 100:03.0f}",
             # Rainfall (in hundredths of an inch) since midnigt
-            f'P{self.rain_since_midnight * 100:03.0f}',
+            f"P{self.rain_since_midnight * 100:03.0f}",
             # Humidity
-            f'h{self.humidity:02d}',
+            f"h{self.humidity:02d}",
             # Barometric pressure (in tenths of millibars/tenths of hPascal)
-            f'b{self.pressure:05.0f}',
+            f"b{self.pressure:05.0f}",
         ]
         if self.comment:
-            comment = self._filter_for_send(self.comment)
+            comment = self.filter_for_send(self.comment)
             contents.append(comment)
-        self.payload = ''.join(contents)
+        self.payload = "".join(contents)

     def _build_raw(self):
-        self.raw = f'{self.from_call}>{self.to_call},WIDE1-1,WIDE2-1:{self.payload}'
+        self.raw = f"{self.from_call}>{self.to_call},WIDE1-1,WIDE2-1:" f"{self.payload}"


 @dataclass(unsafe_hash=True)
 class ThirdPartyPacket(Packet, DataClassJsonMixin):
-    _type: str = 'ThirdPartyPacket'
+    _type: str = "ThirdPartyPacket"
     # Holds the encapsulated packet
     subpacket: Optional[type[Packet]] = field(default=None, compare=True, hash=False)

     def __repr__(self):
         """Build the repr version of the packet."""
         repr_str = (
-            f'{self.__class__.__name__}:'
+            f"{self.__class__.__name__}:"
-            f' From: {self.from_call} '
+            f" From: {self.from_call} "
-            f' To: {self.to_call} '
+            f" To: {self.to_call} "
-            f' Subpacket: {repr(self.subpacket)}'
+            f" Subpacket: {repr(self.subpacket)}"
         )

         return repr_str
@@ -664,12 +665,12 @@ class ThirdPartyPacket(Packet, DataClassJsonMixin):
     @property
     def key(self) -> str:
         """Build a key for finding this packet in a dict."""
-        return f'{self.from_call}:{self.subpacket.key}'
+        return f"{self.from_call}:{self.subpacket.key}"

     @property
     def human_info(self) -> str:
         sub_info = self.subpacket.human_info
-        return f'{self.from_call}->{self.to_call} {sub_info}'
+        return f"{self.from_call}->{self.to_call} {sub_info}"


 @dataclass_json(undefined=Undefined.INCLUDE)
@@ -681,12 +682,11 @@ class UnknownPacket:
     """

     unknown_fields: CatchAll
-    _type: str = 'UnknownPacket'
+    _type: str = "UnknownPacket"
     from_call: Optional[str] = field(default=None)
     to_call: Optional[str] = field(default=None)
     msgNo: str = field(default_factory=_init_msgNo)  # noqa: N815
     format: Optional[str] = field(default=None)
-    timestamp: float = field(default_factory=_init_timestamp, compare=False, hash=False)
     raw: Optional[str] = field(default=None)
     raw_dict: dict = field(
         repr=False, default_factory=lambda: {}, compare=False, hash=False
@@ -694,13 +694,11 @@ class UnknownPacket:
     path: List[str] = field(default_factory=list, compare=False, hash=False)
     packet_type: Optional[str] = field(default=None)
     via: Optional[str] = field(default=None, compare=False, hash=False)
-    # Was the packet previously processed (for dupe checking)
-    processed: bool = field(repr=False, default=False, compare=False, hash=False)

     @property
     def key(self) -> str:
         """Build a key for finding this packet in a dict."""
-        return f'{self.from_call}:{self.packet_type}:{self.to_call}'
+        return f"{self.from_call}:{self.packet_type}:{self.to_call}"

     @property
     def human_info(self) -> str:
@@ -727,20 +725,20 @@ TYPE_LOOKUP: dict[str, type[Packet]] = {
 def get_packet_type(packet: dict) -> str:
     """Decode the packet type from the packet."""

-    pkt_format = packet.get('format')
+    pkt_format = packet.get("format")
-    msg_response = packet.get('response')
+    msg_response = packet.get("response")
     packet_type = PACKET_TYPE_UNKNOWN
-    if pkt_format == 'message' and msg_response == 'ack':
+    if pkt_format == "message" and msg_response == "ack":
         packet_type = PACKET_TYPE_ACK
-    elif pkt_format == 'message' and msg_response == 'rej':
+    elif pkt_format == "message" and msg_response == "rej":
         packet_type = PACKET_TYPE_REJECT
-    elif pkt_format == 'message':
+    elif pkt_format == "message":
         packet_type = PACKET_TYPE_MESSAGE
-    elif pkt_format == 'mic-e':
+    elif pkt_format == "mic-e":
         packet_type = PACKET_TYPE_MICE
-    elif pkt_format == 'object':
+    elif pkt_format == "object":
         packet_type = PACKET_TYPE_OBJECT
-    elif pkt_format == 'status':
+    elif pkt_format == "status":
         packet_type = PACKET_TYPE_STATUS
     elif pkt_format == PACKET_TYPE_BULLETIN:
         packet_type = PACKET_TYPE_BULLETIN
@@ -751,13 +749,13 @@ def get_packet_type(packet: dict) -> str:
     elif pkt_format == PACKET_TYPE_WX:
         packet_type = PACKET_TYPE_WEATHER
     elif pkt_format == PACKET_TYPE_UNCOMPRESSED:
-        if packet.get('symbol') == '_':
+        if packet.get("symbol") == "_":
             packet_type = PACKET_TYPE_WEATHER
     elif pkt_format == PACKET_TYPE_THIRDPARTY:
         packet_type = PACKET_TYPE_THIRDPARTY

     if packet_type == PACKET_TYPE_UNKNOWN:
-        if 'latitude' in packet:
+        if "latitude" in packet:
             packet_type = PACKET_TYPE_BEACON
         else:
             packet_type = PACKET_TYPE_UNKNOWN
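
`get_packet_type()` only inspects the `format` and `response` keys of the parsed dict, with `symbol` and `latitude` as fallbacks, so it can be exercised with hand-built dicts. A small sketch, assuming the function is imported from the packets core module shown in this diff:

```python
# Hand-built dicts that mirror the keys the function checks above.
ack = {'format': 'message', 'response': 'ack'}
mice = {'format': 'mic-e'}
position = {'latitude': 37.1, 'longitude': -122.0}

print(get_packet_type(ack))       # ack packet type constant
print(get_packet_type(mice))      # mic-e packet type constant
print(get_packet_type(position))  # no format match, falls through to the beacon type
```
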
@@ -779,32 +777,32 @@ def is_mice_packet(packet: dict[Any, Any]) -> bool:
 def factory(raw_packet: dict[Any, Any]) -> type[Packet]:
     """Factory method to create a packet from a raw packet string."""
     raw = raw_packet
-    if '_type' in raw:
+    if "_type" in raw:
-        cls = globals()[raw['_type']]
+        cls = globals()[raw["_type"]]
         return cls.from_dict(raw)

-    raw['raw_dict'] = raw.copy()
+    raw["raw_dict"] = raw.copy()
     raw = _translate_fields(raw)

     packet_type = get_packet_type(raw)

-    raw['packet_type'] = packet_type
+    raw["packet_type"] = packet_type
     packet_class = TYPE_LOOKUP[packet_type]
     if packet_type == PACKET_TYPE_WX:
         # the weather information is in a dict
         # this brings those values out to the outer dict
         packet_class = WeatherPacket
-    elif packet_type == PACKET_TYPE_OBJECT and 'weather' in raw:
+    elif packet_type == PACKET_TYPE_OBJECT and "weather" in raw:
         packet_class = WeatherPacket
     elif packet_type == PACKET_TYPE_UNKNOWN:
         # Try and figure it out here
-        if 'latitude' in raw:
+        if "latitude" in raw:
             packet_class = GPSPacket
         else:
             # LOG.warning(raw)
             packet_class = UnknownPacket

-    raw.get('addresse', raw.get('to_call'))
+    raw.get("addresse", raw.get("to_call"))

     # TODO: Find a global way to enable/disable this
     # LOGU.opt(colors=True).info(
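
`factory()` routes a parsed frame to the matching packet class via `get_packet_type()` and `TYPE_LOOKUP`. A usage sketch, assuming `aprslib` for parsing and that the module is importable as `aprsd.packets.core` (the hunk does not name the file):

```python
import aprslib

from aprsd.packets import core  # import path assumed

# Hypothetical APRS message frame used only for illustration.
parsed = aprslib.parse('N0CALL>APRS::N0CALL-1 :hello{001')
pkt = core.factory(parsed)
print(pkt.__class__.__name__)  # expected: MessagePacket
print(pkt.human_info)          # the filtered message text
```
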
@@ -1,58 +0,0 @@
-import logging
-from typing import Callable, Protocol, runtime_checkable, Union, Dict
-
-from aprsd.packets import core
-from aprsd.utils import singleton
-
-LOG = logging.getLogger("APRSD")
-
-
-@runtime_checkable
-class PacketFilterProtocol(Protocol):
-    """Protocol API for a packet filter class.
-    """
-    def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
-        """When we get a packet from the network.
-
-        Return a Packet object if the filter passes. Return None if the
-        Packet is filtered out.
-        """
-        ...
-
-
-@singleton
-class PacketFilter:
-
-    def __init__(self):
-        self.filters: Dict[str, Callable] = {}
-
-    def register(self, packet_filter: Callable) -> None:
-        if not isinstance(packet_filter, PacketFilterProtocol):
-            raise TypeError(f"class {packet_filter} is not a PacketFilterProtocol object")
-
-        if packet_filter not in self.filters:
-            self.filters[packet_filter] = packet_filter()
-
-    def unregister(self, packet_filter: Callable) -> None:
-        if not isinstance(packet_filter, PacketFilterProtocol):
-            raise TypeError(f"class {packet_filter} is not a PacketFilterProtocol object")
-        if packet_filter in self.filters:
-            del self.filters[packet_filter]
-
-    def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
-        """Run through each of the filters.
-
-        This will step through each registered filter class
-        and call filter on it.
-
-        If the filter object returns None, we are done filtering.
-        If the filter object returns the packet, we continue filtering.
-        """
-        for packet_filter in self.filters:
-            try:
-                if not self.filters[packet_filter].filter(packet):
-                    LOG.debug(f"{self.filters[packet_filter].__class__.__name__} dropped {packet.__class__.__name__}:{packet.human_info}")
-                    return None
-            except Exception as ex:
-                LOG.error(f"{packet_filter.__clas__.__name__} failed filtering packet {packet.__class__.__name__} : {ex}")
-        return packet
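
The filter module above (present only on the master side of this compare) defines the whole filtering API: implement `filter()` per `PacketFilterProtocol`, then hand the class (not an instance) to `PacketFilter().register()`, which instantiates and stores it. A sketch, with the import path assumed since the hunk does not name the file:

```python
from typing import Union

from aprsd.packets import core
from aprsd.packets.filter import PacketFilter  # import path assumed


class AckOnlyFilter:
    """Toy filter that only lets AckPackets through."""

    def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
        # Returning None drops the packet; returning it lets filtering continue.
        return packet if isinstance(packet, core.AckPacket) else None


PacketFilter().register(AckOnlyFilter)
```
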
@@ -1,68 +0,0 @@
-import logging
-from typing import Union
-
-from oslo_config import cfg
-
-from aprsd import packets
-from aprsd.packets import core
-
-CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
-
-
-class DupePacketFilter:
-    """This is a packet filter to detect duplicate packets.
-
-    This Uses the PacketList object to see if a packet exists
-    already. If it does exist in the PacketList, then we need to
-    check the flag on the packet to see if it's been processed before.
-    If the packet has been processed already within the allowed
-    timeframe, then it's a dupe.
-    """
-
-    def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
-        # LOG.debug(f"{self.__class__.__name__}.filter called for packet {packet}")
-        """Filter a packet out if it's already been seen and processed."""
-        if isinstance(packet, core.AckPacket):
-            # We don't need to drop AckPackets, those should be
-            # processed.
-            # Send the AckPacket to the queue for processing elsewhere.
-            return packet
-        else:
-            # Make sure we aren't re-processing the same packet
-            # For RF based APRS Clients we can get duplicate packets
-            # So we need to track them and not process the dupes.
-            pkt_list = packets.PacketList()
-            found = False
-            try:
-                # Find the packet in the list of already seen packets
-                # Based on the packet.key
-                found = pkt_list.find(packet)
-                if not packet.msgNo:
-                    # If the packet doesn't have a message id
-                    # then there is no reliable way to detect
-                    # if it's a dupe, so we just pass it on.
-                    # it shouldn't get acked either.
-                    found = False
-            except KeyError:
-                found = False
-
-            if not found:
-                # We haven't seen this packet before, so we process it.
-                return packet
-
-            if not packet.processed:
-                # We haven't processed this packet through the plugins.
-                return packet
-            elif packet.timestamp - found.timestamp < CONF.packet_dupe_timeout:
-                # If the packet came in within N seconds of the
-                # Last time seeing the packet, then we drop it as a dupe.
-                LOG.warning(
-                    f'Packet {packet.from_call}:{packet.msgNo} already tracked, dropping.'
-                )
-            else:
-                LOG.warning(
-                    f'Packet {packet.from_call}:{packet.msgNo} already tracked '
-                    f'but older than {CONF.packet_dupe_timeout} seconds. processing.',
-                )
-            return packet
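
`DupePacketFilter` relies on `PacketList().find()`, the packet's `processed` flag, and `CONF.packet_dupe_timeout`; nothing is dropped unless the same message was already processed inside that window. Wiring it into the registry is one call (both import paths are assumptions, not shown in these hunks):

```python
from aprsd.packets.filter import PacketFilter                   # path assumed
from aprsd.packets.filters.dupe_filter import DupePacketFilter  # path assumed

PacketFilter().register(DupePacketFilter)
# From here on, PacketFilter().filter(pkt) returns None for duplicates
# that were seen and processed within CONF.packet_dupe_timeout seconds.
```
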
@@ -1,53 +0,0 @@
-import logging
-from typing import Union
-
-from oslo_config import cfg
-
-from aprsd import packets
-from aprsd.packets import core
-from aprsd.utils import singleton
-
-CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
-
-
-@singleton
-class PacketTypeFilter:
-    """This filter is used to filter out packets that don't match a specific type.
-
-    To use this, register it with the PacketFilter class,
-    then instante it and call set_allow_list() with a list of packet types
-    you want to allow to pass the filtering. All other packets will be
-    filtered out.
-    """
-
-    filters = {
-        packets.Packet.__name__: packets.Packet,
-        packets.AckPacket.__name__: packets.AckPacket,
-        packets.BeaconPacket.__name__: packets.BeaconPacket,
-        packets.GPSPacket.__name__: packets.GPSPacket,
-        packets.MessagePacket.__name__: packets.MessagePacket,
-        packets.MicEPacket.__name__: packets.MicEPacket,
-        packets.ObjectPacket.__name__: packets.ObjectPacket,
-        packets.StatusPacket.__name__: packets.StatusPacket,
-        packets.ThirdPartyPacket.__name__: packets.ThirdPartyPacket,
-        packets.WeatherPacket.__name__: packets.WeatherPacket,
-        packets.UnknownPacket.__name__: packets.UnknownPacket,
-    }
-
-    allow_list = ()
-
-    def set_allow_list(self, filter_list):
-        tmp_list = []
-        for filter in filter_list:
-            LOG.warning(
-                f'Setting filter {filter} : {self.filters[filter]} to tmp {tmp_list}'
-            )
-            tmp_list.append(self.filters[filter])
-        self.allow_list = tuple(tmp_list)
-
-    def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
-        """Only allow packets of certain types to filter through."""
-        if self.allow_list:
-            if isinstance(packet, self.allow_list):
-                return packet
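
`PacketTypeFilter` keys its `filters` table by class name, so `set_allow_list()` takes plain strings. A sketch combining it with the registry (import paths assumed):

```python
from aprsd.packets.filter import PacketFilter                   # path assumed
from aprsd.packets.filters.packet_type import PacketTypeFilter  # path assumed

PacketFilter().register(PacketTypeFilter)
# Allow only message and ack traffic through; everything else is dropped.
PacketTypeFilter().set_allow_list(['MessagePacket', 'AckPacket'])
```
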
@@ -12,13 +12,13 @@ LOG = logging.getLogger()
 LOGU = logger
 CONF = cfg.CONF

-FROM_COLOR = 'fg #C70039'
+FROM_COLOR = "fg #C70039"
-TO_COLOR = 'fg #D033FF'
+TO_COLOR = "fg #D033FF"
-TX_COLOR = 'red'
+TX_COLOR = "red"
-RX_COLOR = 'green'
+RX_COLOR = "green"
-PACKET_COLOR = 'cyan'
+PACKET_COLOR = "cyan"
-DISTANCE_COLOR = 'fg #FF5733'
+DISTANCE_COLOR = "fg #FF5733"
-DEGREES_COLOR = 'fg #FFA900'
+DEGREES_COLOR = "fg #FFA900"


 def log_multiline(
@@ -27,11 +27,11 @@ def log_multiline(
     """LOG a packet to the logfile."""
     if not CONF.enable_packet_logging:
         return
-    if CONF.log_packet_format == 'compact':
+    if CONF.log_packet_format == "compact":
         return

     # asdict(packet)
-    logit = ['\n']
+    logit = ["\n"]
     name = packet.__class__.__name__

     if isinstance(packet, AckPacket):
@@ -41,67 +41,57 @@ def log_multiline(

     if header:
         if tx:
-            header_str = f'<{TX_COLOR}>TX</{TX_COLOR}>'
+            header_str = f"<{TX_COLOR}>TX</{TX_COLOR}>"
             logit.append(
-                f'{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}> '
+                f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}> "
-                f'TX:{packet.send_count + 1} of {pkt_max_send_count}',
+                f"TX:{packet.send_count + 1} of {pkt_max_send_count}",
             )
         else:
-            header_str = f'<{RX_COLOR}>RX</{RX_COLOR}>'
+            header_str = f"<{RX_COLOR}>RX</{RX_COLOR}>"
             logit.append(
-                f'{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)',
+                f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)",
             )

     else:
-        header_str = ''
+        header_str = ""
-        logit.append(f'__________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)')
+        logit.append(f"__________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)")
     # log_list.append(f" Packet : {packet.__class__.__name__}")
     if packet.msgNo:
-        logit.append(f' Msg # : {packet.msgNo}')
+        logit.append(f" Msg # : {packet.msgNo}")
     if packet.from_call:
-        logit.append(f' From : <{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}>')
+        logit.append(f" From : <{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}>")
     if packet.to_call:
-        logit.append(f' To : <{TO_COLOR}>{packet.to_call}</{TO_COLOR}>')
+        logit.append(f" To : <{TO_COLOR}>{packet.to_call}</{TO_COLOR}>")
-    if hasattr(packet, 'path') and packet.path:
+    if hasattr(packet, "path") and packet.path:
-        logit.append(f' Path : {"=>".join(packet.path)}')
+        logit.append(f" Path : {'=>'.join(packet.path)}")
-    if hasattr(packet, 'via') and packet.via:
+    if hasattr(packet, "via") and packet.via:
-        logit.append(f' VIA : {packet.via}')
+        logit.append(f" VIA : {packet.via}")

     if not isinstance(packet, AckPacket) and not isinstance(packet, RejectPacket):
         msg = packet.human_info

         if msg:
-            msg = msg.replace('<', '\\<')
+            msg = msg.replace("<", "\\<")
-            logit.append(f' Info : <light-yellow><b>{msg}</b></light-yellow>')
+            logit.append(f" Info : <light-yellow><b>{msg}</b></light-yellow>")

-    if hasattr(packet, 'comment') and packet.comment:
+    if hasattr(packet, "comment") and packet.comment:
-        logit.append(f' Comment : {packet.comment}')
+        logit.append(f" Comment : {packet.comment}")

-    raw = packet.raw.replace('<', '\\<')
+    raw = packet.raw.replace("<", "\\<")
-    logit.append(f' Raw : <fg #828282>{raw}</fg #828282>')
+    logit.append(f" Raw : <fg #828282>{raw}</fg #828282>")
-    logit.append(f'{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)')
+    logit.append(f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)")

-    LOGU.opt(colors=True).info('\n'.join(logit))
+    LOGU.opt(colors=True).info("\n".join(logit))
     LOG.debug(repr(packet))


-def log(
+def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> None:
-    packet,
-    tx: Optional[bool] = False,
-    header: Optional[bool] = True,
-    packet_count: Optional[int] = None,
-) -> None:
     if not CONF.enable_packet_logging:
         return
-    if CONF.log_packet_format == 'multiline':
+    if CONF.log_packet_format == "multiline":
         log_multiline(packet, tx, header)
         return

-    if not packet_count:
-        packet_count = ''
-    else:
-        packet_count = f'({packet_count:d})'

     logit = []
     name = packet.__class__.__name__
     if isinstance(packet, AckPacket):
@@ -111,47 +101,47 @@ def log(

     if header:
         if tx:
-            via_color = 'red'
+            via_color = "red"
-            arrow = f'<{via_color}>\u2192</{via_color}>'
+            arrow = f"<{via_color}>\u2192</{via_color}>"
             logit.append(
-                f'<red>TX{packet_count}\u2191</red> '
+                f"<red>TX\u2191</red> "
-                f'<cyan>{name}</cyan>'
+                f"<cyan>{name}</cyan>"
-                f':{packet.msgNo}'
+                f":{packet.msgNo}"
-                f' ({packet.send_count + 1} of {pkt_max_send_count})',
+                f" ({packet.send_count + 1} of {pkt_max_send_count})",
             )
         else:
-            via_color = 'fg #1AA730'
+            via_color = "fg #1AA730"
-            arrow = f'<{via_color}>\u2192</{via_color}>'
+            arrow = f"<{via_color}>\u2192</{via_color}>"
-            f'<{via_color}><-</{via_color}>'
+            f"<{via_color}><-</{via_color}>"
             logit.append(
-                f'<fg #1AA730>RX{packet_count}\u2193</fg #1AA730> '
+                f"<fg #1AA730>RX\u2193</fg #1AA730> "
-                f'<cyan>{name}</cyan>'
+                f"<cyan>{name}</cyan>"
-                f':{packet.msgNo}',
+                f":{packet.msgNo}",
             )
     else:
-        via_color = 'green'
+        via_color = "green"
-        arrow = f'<{via_color}>-></{via_color}>'
+        arrow = f"<{via_color}>-></{via_color}>"
         logit.append(
-            f'<cyan>{name}</cyan>:{packet.msgNo}',
+            f"<cyan>{name}</cyan>" f":{packet.msgNo}",
         )

     tmp = None
     if packet.path:
-        tmp = f'{arrow}'.join(packet.path) + f'{arrow} '
+        tmp = f"{arrow}".join(packet.path) + f"{arrow} "

     logit.append(
-        f'<{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}> {arrow}'
+        f"<{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}> {arrow}"
-        f'{tmp if tmp else " "}'
+        f"{tmp if tmp else ' '}"
-        f'<{TO_COLOR}>{packet.to_call}</{TO_COLOR}>',
+        f"<{TO_COLOR}>{packet.to_call}</{TO_COLOR}>",
     )

     if not isinstance(packet, AckPacket) and not isinstance(packet, RejectPacket):
-        logit.append(':')
+        logit.append(":")
         msg = packet.human_info

         if msg:
-            msg = msg.replace('<', '\\<')
+            msg = msg.replace("<", "\\<")
-            logit.append(f'<light-yellow><b>{msg}</b></light-yellow>')
+            logit.append(f"<light-yellow><b>{msg}</b></light-yellow>")

         # is there distance information?
         if isinstance(packet, GPSPacket) and CONF.latitude and CONF.longitude:
@@ -160,12 +150,12 @@ def log(
             try:
                 bearing = utils.calculate_initial_compass_bearing(my_coords, packet_coords)
             except Exception as e:
-                LOG.error(f'Failed to calculate bearing: {e}')
+                LOG.error(f"Failed to calculate bearing: {e}")
                 bearing = 0
             logit.append(
-                f' : <{DEGREES_COLOR}>{utils.degrees_to_cardinal(bearing, full_string=True)}</{DEGREES_COLOR}>'
+                f" : <{DEGREES_COLOR}>{utils.degrees_to_cardinal(bearing, full_string=True)}</{DEGREES_COLOR}>"
-                f'<{DISTANCE_COLOR}>@{haversine(my_coords, packet_coords, unit=Unit.MILES):.2f}miles</{DISTANCE_COLOR}>',
+                f"<{DISTANCE_COLOR}>@{haversine(my_coords, packet_coords, unit=Unit.MILES):.2f}miles</{DISTANCE_COLOR}>",
             )

-    LOGU.opt(colors=True).info(' '.join(logit))
+    LOGU.opt(colors=True).info(" ".join(logit))
     log_multiline(packet, tx, header)
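
Both `log()` and `log_multiline()` above are gated by `CONF.enable_packet_logging` and `CONF.log_packet_format`; the `packet_count` argument exists only on the master side of this diff. A usage sketch, assuming `pkt` is any packet built by the factory shown earlier:

```python
from oslo_config import cfg

CONF = cfg.CONF
CONF.set_override('enable_packet_logging', True)
CONF.set_override('log_packet_format', 'compact')  # 'multiline' delegates to log_multiline()

log(pkt, tx=True, packet_count=3)  # packet_count is only accepted on master
```
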
@@ -7,7 +7,7 @@ from aprsd.packets import core
 from aprsd.utils import objectstore

 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 class PacketList(objectstore.ObjectStoreMixin):
@@ -27,8 +27,8 @@ class PacketList(objectstore.ObjectStoreMixin):

     def _init_data(self):
         self.data = {
-            'types': {},
+            "types": {},
-            'packets': OrderedDict(),
+            "packets": OrderedDict(),
         }

     def rx(self, packet: type[core.Packet]):
@@ -37,11 +37,11 @@ class PacketList(objectstore.ObjectStoreMixin):
             self._total_rx += 1
             self._add(packet)
             ptype = packet.__class__.__name__
-            type_stats = self.data['types'].setdefault(
+            type_stats = self.data["types"].setdefault(
                 ptype,
-                {'tx': 0, 'rx': 0},
+                {"tx": 0, "rx": 0},
             )
-            type_stats['rx'] += 1
+            type_stats["rx"] += 1

     def tx(self, packet: type[core.Packet]):
         """Add a packet that was received."""
@@ -49,32 +49,32 @@ class PacketList(objectstore.ObjectStoreMixin):
             self._total_tx += 1
             self._add(packet)
             ptype = packet.__class__.__name__
-            type_stats = self.data['types'].setdefault(
+            type_stats = self.data["types"].setdefault(
                 ptype,
-                {'tx': 0, 'rx': 0},
+                {"tx": 0, "rx": 0},
             )
-            type_stats['tx'] += 1
+            type_stats["tx"] += 1

     def add(self, packet):
         with self.lock:
             self._add(packet)

     def _add(self, packet):
-        if not self.data.get('packets'):
+        if not self.data.get("packets"):
             self._init_data()
-        if packet.key in self.data['packets']:
+        if packet.key in self.data["packets"]:
-            self.data['packets'].move_to_end(packet.key)
+            self.data["packets"].move_to_end(packet.key)
-        elif len(self.data['packets']) == self.maxlen:
+        elif len(self.data["packets"]) == self.maxlen:
-            self.data['packets'].popitem(last=False)
+            self.data["packets"].popitem(last=False)
-        self.data['packets'][packet.key] = packet
+        self.data["packets"][packet.key] = packet

     def find(self, packet):
         with self.lock:
-            return self.data['packets'][packet.key]
+            return self.data["packets"][packet.key]

     def __len__(self):
         with self.lock:
-            return len(self.data['packets'])
+            return len(self.data["packets"])

     def total_rx(self):
         with self.lock:
@@ -87,23 +87,17 @@ class PacketList(objectstore.ObjectStoreMixin):
     def stats(self, serializable=False) -> dict:
         with self.lock:
             # Get last N packets directly using list slicing
-            if CONF.packet_list_stats_maxlen >= 0:
-                packets_list = list(self.data.get('packets', {}).values())
-                pkts = packets_list[-CONF.packet_list_stats_maxlen :][::-1]
-            else:
-                # We have to copy here, because this get() results in a pointer
-                # to the packets internally here, which can change after this
-                # function returns, which would cause a problem trying to save
-                # the stats to disk.
-                pkts = self.data.get('packets', {}).copy()
+            packets_list = list(self.data.get("packets", {}).values())
+            pkts = packets_list[-CONF.packet_list_stats_maxlen :][::-1]
             stats = {
-                'total_tracked': self._total_rx
+                "total_tracked": self._total_rx
                 + self._total_tx,  # Fixed typo: was rx + rx
-                'rx': self._total_rx,
+                "rx": self._total_rx,
-                'tx': self._total_tx,
+                "tx": self._total_tx,
-                'types': self.data.get('types', {}),  # Changed default from [] to {}
+                "types": self.data.get("types", {}),  # Changed default from [] to {}
-                'packet_count': len(self.data.get('packets', [])),
+                "packet_count": len(self.data.get("packets", [])),
-                'maxlen': self.maxlen,
+                "maxlen": self.maxlen,
-                'packets': pkts,
+                "packets": pkts,
             }
             return stats
109
aprsd/plugin.py
109
aprsd/plugin.py
@ -12,30 +12,29 @@ import pluggy
|
|||||||
from oslo_config import cfg
|
from oslo_config import cfg
|
||||||
|
|
||||||
import aprsd
|
import aprsd
|
||||||
from aprsd import packets, threads
|
from aprsd import client, packets, threads
|
||||||
from aprsd.client.client import APRSDClient
|
|
||||||
from aprsd.packets import watch_list
|
from aprsd.packets import watch_list
|
||||||
|
|
||||||
# setup the global logger
|
# setup the global logger
|
||||||
CONF = cfg.CONF
|
CONF = cfg.CONF
|
||||||
LOG = logging.getLogger('APRSD')
|
LOG = logging.getLogger("APRSD")
|
||||||
|
|
||||||
CORE_MESSAGE_PLUGINS = [
|
CORE_MESSAGE_PLUGINS = [
|
||||||
'aprsd.plugins.email.EmailPlugin',
|
"aprsd.plugins.email.EmailPlugin",
|
||||||
'aprsd.plugins.fortune.FortunePlugin',
|
"aprsd.plugins.fortune.FortunePlugin",
|
||||||
'aprsd.plugins.location.LocationPlugin',
|
"aprsd.plugins.location.LocationPlugin",
|
||||||
'aprsd.plugins.ping.PingPlugin',
|
"aprsd.plugins.ping.PingPlugin",
|
||||||
'aprsd.plugins.time.TimePlugin',
|
"aprsd.plugins.time.TimePlugin",
|
||||||
'aprsd.plugins.weather.USWeatherPlugin',
|
"aprsd.plugins.weather.USWeatherPlugin",
|
||||||
'aprsd.plugins.version.VersionPlugin',
|
"aprsd.plugins.version.VersionPlugin",
|
||||||
]
|
]
|
||||||
|
|
||||||
CORE_NOTIFY_PLUGINS = [
|
CORE_NOTIFY_PLUGINS = [
|
||||||
'aprsd.plugins.notify.NotifySeenPlugin',
|
"aprsd.plugins.notify.NotifySeenPlugin",
|
||||||
]
|
]
|
||||||
|
|
||||||
hookspec = pluggy.HookspecMarker('aprsd')
|
hookspec = pluggy.HookspecMarker("aprsd")
|
||||||
hookimpl = pluggy.HookimplMarker('aprsd')
|
hookimpl = pluggy.HookimplMarker("aprsd")
|
||||||
|
|
||||||
|
|
||||||
class APRSDPluginSpec:
|
class APRSDPluginSpec:
|
||||||
@ -77,14 +76,14 @@ class APRSDPluginBase(metaclass=abc.ABCMeta):
|
|||||||
else:
|
else:
|
||||||
LOG.error(
|
LOG.error(
|
||||||
"Can't start thread {}:{}, Must be a child "
|
"Can't start thread {}:{}, Must be a child "
|
||||||
'of aprsd.threads.APRSDThread'.format(
|
"of aprsd.threads.APRSDThread".format(
|
||||||
self,
|
self,
|
||||||
thread,
|
thread,
|
||||||
),
|
),
|
||||||
)
|
)
|
||||||
except Exception:
|
except Exception:
|
||||||
LOG.error(
|
LOG.error(
|
||||||
'Failed to start threads for plugin {}'.format(
|
"Failed to start threads for plugin {}".format(
|
||||||
self,
|
self,
|
||||||
),
|
),
|
||||||
)
|
)
|
||||||
@ -94,7 +93,7 @@ class APRSDPluginBase(metaclass=abc.ABCMeta):
|
|||||||
return self.message_counter
|
return self.message_counter
|
||||||
|
|
||||||
def help(self) -> str:
|
def help(self) -> str:
|
||||||
return 'Help!'
|
return "Help!"
|
||||||
|
|
||||||
@abc.abstractmethod
|
@abc.abstractmethod
|
||||||
def setup(self):
|
def setup(self):
|
||||||
@ -147,11 +146,11 @@ class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
watch_list = CONF.watch_list.callsigns
|
watch_list = CONF.watch_list.callsigns
|
||||||
# make sure the timeout is set or this doesn't work
|
# make sure the timeout is set or this doesn't work
|
||||||
if watch_list:
|
if watch_list:
|
||||||
aprs_client = APRSDClient()
|
aprs_client = client.client_factory.create().client
|
||||||
filter_str = 'b/{}'.format('/'.join(watch_list))
|
filter_str = "b/{}".format("/".join(watch_list))
|
||||||
aprs_client.set_filter(filter_str)
|
aprs_client.set_filter(filter_str)
|
||||||
else:
|
else:
|
||||||
LOG.warning('Watch list enabled, but no callsigns set.')
|
LOG.warning("Watch list enabled, but no callsigns set.")
|
||||||
|
|
||||||
@hookimpl
|
@hookimpl
|
||||||
def filter(self, packet: type[packets.Packet]) -> str | packets.MessagePacket:
|
def filter(self, packet: type[packets.Packet]) -> str | packets.MessagePacket:
|
||||||
@ -165,7 +164,7 @@ class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
result = self.process(packet)
|
result = self.process(packet)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(
|
LOG.error(
|
||||||
'Plugin {} failed to process packet {}'.format(
|
"Plugin {} failed to process packet {}".format(
|
||||||
self.__class__,
|
self.__class__,
|
||||||
ex,
|
ex,
|
||||||
),
|
),
|
||||||
@ -173,7 +172,7 @@ class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
if result:
|
if result:
|
||||||
self.tx_inc()
|
self.tx_inc()
|
||||||
else:
|
else:
|
||||||
LOG.warning(f'{self.__class__} plugin is not enabled')
|
LOG.warning(f"{self.__class__} plugin is not enabled")
|
||||||
|
|
||||||
return result
|
return result
|
||||||
|
|
||||||
@ -197,7 +196,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
raise NotImplementedError
|
raise NotImplementedError
|
||||||
|
|
||||||
def help(self):
|
def help(self):
|
||||||
return '{}: {}'.format(
|
return "{}: {}".format(
|
||||||
self.command_name.lower(),
|
self.command_name.lower(),
|
||||||
self.command_regex,
|
self.command_regex,
|
||||||
)
|
)
|
||||||
@ -208,7 +207,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
|
|
||||||
@hookimpl
|
@hookimpl
|
||||||
def filter(self, packet: packets.MessagePacket) -> str | packets.MessagePacket:
|
def filter(self, packet: packets.MessagePacket) -> str | packets.MessagePacket:
|
||||||
LOG.debug(f'{self.__class__.__name__} called')
|
LOG.debug(f"{self.__class__.__name__} called")
|
||||||
if not self.enabled:
|
if not self.enabled:
|
||||||
result = f"{self.__class__.__name__} isn't enabled"
|
result = f"{self.__class__.__name__} isn't enabled"
|
||||||
LOG.warning(result)
|
LOG.warning(result)
|
||||||
@ -216,7 +215,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
|
|
||||||
if not isinstance(packet, packets.MessagePacket):
|
if not isinstance(packet, packets.MessagePacket):
|
||||||
LOG.warning(
|
LOG.warning(
|
||||||
f'{self.__class__.__name__} Got a {packet.__class__.__name__} ignoring'
|
f"{self.__class__.__name__} Got a {packet.__class__.__name__} ignoring"
|
||||||
)
|
)
|
||||||
return packets.NULL_MESSAGE
|
return packets.NULL_MESSAGE
|
||||||
|
|
||||||
@ -238,7 +237,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
|
|||||||
result = self.process(packet)
|
result = self.process(packet)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(
|
LOG.error(
|
||||||
'Plugin {} failed to process packet {}'.format(
|
"Plugin {} failed to process packet {}".format(
|
||||||
self.__class__,
|
self.__class__,
|
||||||
ex,
|
ex,
|
||||||
),
|
),
|
||||||
@ -255,7 +254,7 @@ class APRSFIKEYMixin:
|
|||||||
|
|
||||||
def ensure_aprs_fi_key(self):
|
def ensure_aprs_fi_key(self):
|
||||||
if not CONF.aprs_fi.apiKey:
|
if not CONF.aprs_fi.apiKey:
|
||||||
LOG.error('Config aprs_fi.apiKey is not set')
|
LOG.error("Config aprs_fi.apiKey is not set")
|
||||||
self.enabled = False
|
self.enabled = False
|
||||||
else:
|
else:
|
||||||
self.enabled = True
|
self.enabled = True
|
||||||
@ -267,25 +266,25 @@ class HelpPlugin(APRSDRegexCommandPluginBase):
|
|||||||
This plugin is in this file to prevent a circular import.
|
This plugin is in this file to prevent a circular import.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
command_regex = '^[hH]'
|
command_regex = "^[hH]"
|
||||||
command_name = 'help'
|
command_name = "help"
|
||||||
|
|
||||||
def help(self):
|
def help(self):
|
||||||
return 'Help: send APRS help or help <plugin>'
|
return "Help: send APRS help or help <plugin>"
|
||||||
|
|
||||||
def process(self, packet: packets.MessagePacket):
|
def process(self, packet: packets.MessagePacket):
|
||||||
LOG.info('HelpPlugin')
|
LOG.info("HelpPlugin")
|
||||||
# fromcall = packet.get("from")
|
# fromcall = packet.get("from")
|
||||||
message = packet.message_text
|
message = packet.message_text
|
||||||
# ack = packet.get("msgNo", "0")
|
# ack = packet.get("msgNo", "0")
|
||||||
a = re.search(r'^.*\s+(.*)', message)
|
a = re.search(r"^.*\s+(.*)", message)
|
||||||
command_name = None
|
command_name = None
|
||||||
if a is not None:
|
if a is not None:
|
||||||
command_name = a.group(1).lower()
|
command_name = a.group(1).lower()
|
||||||
|
|
||||||
pm = PluginManager()
|
pm = PluginManager()
|
||||||
|
|
||||||
if command_name and '?' not in command_name:
|
if command_name and "?" not in command_name:
|
||||||
# user wants help for a specific plugin
|
# user wants help for a specific plugin
|
||||||
reply = None
|
reply = None
|
||||||
for p in pm.get_plugins():
|
for p in pm.get_plugins():
|
||||||
@ -304,20 +303,20 @@ class HelpPlugin(APRSDRegexCommandPluginBase):
|
|||||||
LOG.debug(p)
|
LOG.debug(p)
|
||||||
if p.enabled and isinstance(p, APRSDRegexCommandPluginBase):
|
if p.enabled and isinstance(p, APRSDRegexCommandPluginBase):
|
||||||
name = p.command_name.lower()
|
name = p.command_name.lower()
|
||||||
if name not in list and 'help' not in name:
|
if name not in list and "help" not in name:
|
||||||
list.append(name)
|
list.append(name)
|
||||||
|
|
||||||
list.sort()
|
list.sort()
|
||||||
reply = ' '.join(list)
|
reply = " ".join(list)
|
||||||
lines = textwrap.wrap(reply, 60)
|
lines = textwrap.wrap(reply, 60)
|
||||||
replies = ["Send APRS MSG of 'help' or 'help <plugin>'"]
|
replies = ["Send APRS MSG of 'help' or 'help <plugin>'"]
|
||||||
for line in lines:
|
for line in lines:
|
||||||
replies.append(f'plugins: {line}')
|
replies.append(f"plugins: {line}")
|
||||||
|
|
||||||
for entry in replies:
|
for entry in replies:
|
||||||
LOG.debug(f'{len(entry)} {entry}')
|
LOG.debug(f"{len(entry)} {entry}")
|
||||||
|
|
||||||
LOG.debug(f'{replies}')
|
LOG.debug(f"{replies}")
|
||||||
return replies
|
return replies
|
||||||
|
|
||||||
|
|
||||||
@ -342,17 +341,17 @@ class PluginManager:
|
|||||||
return cls._instance
|
return cls._instance
|
||||||
|
|
||||||
def _init(self):
|
def _init(self):
|
||||||
self._pluggy_pm = pluggy.PluginManager('aprsd')
|
self._pluggy_pm = pluggy.PluginManager("aprsd")
|
||||||
self._pluggy_pm.add_hookspecs(APRSDPluginSpec)
|
self._pluggy_pm.add_hookspecs(APRSDPluginSpec)
|
||||||
# For the watchlist plugins
|
# For the watchlist plugins
|
||||||
self._watchlist_pm = pluggy.PluginManager('aprsd')
|
self._watchlist_pm = pluggy.PluginManager("aprsd")
|
||||||
self._watchlist_pm.add_hookspecs(APRSDPluginSpec)
|
self._watchlist_pm.add_hookspecs(APRSDPluginSpec)
|
||||||
|
|
||||||
def stats(self, serializable=False) -> dict:
|
def stats(self, serializable=False) -> dict:
|
||||||
"""Collect and return stats for all plugins."""
|
"""Collect and return stats for all plugins."""
|
||||||
|
|
||||||
def full_name_with_qualname(obj):
|
def full_name_with_qualname(obj):
|
||||||
return '{}.{}'.format(
|
return "{}.{}".format(
|
||||||
obj.__class__.__module__,
|
obj.__class__.__module__,
|
||||||
obj.__class__.__qualname__,
|
obj.__class__.__qualname__,
|
||||||
)
|
)
|
||||||
@ -362,10 +361,10 @@ class PluginManager:
|
|||||||
if plugins:
|
if plugins:
|
||||||
for p in plugins:
|
for p in plugins:
|
||||||
plugin_stats[full_name_with_qualname(p)] = {
|
plugin_stats[full_name_with_qualname(p)] = {
|
||||||
'enabled': p.enabled,
|
"enabled": p.enabled,
|
||||||
'rx': p.rx_count,
|
"rx": p.rx_count,
|
||||||
'tx': p.tx_count,
|
"tx": p.tx_count,
|
||||||
'version': p.version,
|
"version": p.version,
|
||||||
}
|
}
|
||||||
|
|
||||||
return plugin_stats
|
return plugin_stats
|
||||||
@ -393,19 +392,19 @@ class PluginManager:
|
|||||||
module_name = None
|
module_name = None
|
||||||
class_name = None
|
class_name = None
|
||||||
try:
|
try:
|
||||||
module_name, class_name = module_class_string.rsplit('.', 1)
|
module_name, class_name = module_class_string.rsplit(".", 1)
|
||||||
module = importlib.import_module(module_name)
|
module = importlib.import_module(module_name)
|
||||||
# Commented out because the email thread starts in a different context
|
# Commented out because the email thread starts in a different context
|
||||||
# and hence gives a different singleton for the EmailStats
|
# and hence gives a different singleton for the EmailStats
|
||||||
# module = importlib.reload(module)
|
# module = importlib.reload(module)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
if not module_name:
|
if not module_name:
|
||||||
LOG.error(f'Failed to load Plugin {module_class_string}')
|
LOG.error(f"Failed to load Plugin {module_class_string}")
|
||||||
else:
|
else:
|
||||||
LOG.error(f"Failed to load Plugin '{module_name}' : '{ex}'")
|
LOG.error(f"Failed to load Plugin '{module_name}' : '{ex}'")
|
||||||
return
|
return
|
||||||
|
|
||||||
assert hasattr(module, class_name), 'class {} is not in {}'.format(
|
assert hasattr(module, class_name), "class {} is not in {}".format(
|
||||||
class_name,
|
class_name,
|
||||||
module_name,
|
module_name,
|
||||||
)
|
)
|
||||||
@ -413,7 +412,7 @@ class PluginManager:
|
|||||||
# class_name, module_name))
|
# class_name, module_name))
|
||||||
cls = getattr(module, class_name)
|
cls = getattr(module, class_name)
|
||||||
if super_cls is not None:
|
if super_cls is not None:
|
||||||
assert issubclass(cls, super_cls), 'class {} should inherit from {}'.format(
|
assert issubclass(cls, super_cls), "class {} should inherit from {}".format(
|
||||||
class_name,
|
class_name,
|
||||||
super_cls.__name__,
|
super_cls.__name__,
|
||||||
)
|
)
|
||||||
@ -445,7 +444,7 @@ class PluginManager:
|
|||||||
self._watchlist_pm.register(plugin_obj)
|
self._watchlist_pm.register(plugin_obj)
|
||||||
else:
|
else:
|
||||||
LOG.warning(
|
LOG.warning(
|
||||||
f'Plugin {plugin_obj.__class__.__name__} is disabled'
|
f"Plugin {plugin_obj.__class__.__name__} is disabled"
|
||||||
)
|
)
|
||||||
elif isinstance(plugin_obj, APRSDRegexCommandPluginBase):
|
elif isinstance(plugin_obj, APRSDRegexCommandPluginBase):
|
||||||
if plugin_obj.enabled:
|
if plugin_obj.enabled:
|
||||||
@ -459,7 +458,7 @@ class PluginManager:
|
|||||||
self._pluggy_pm.register(plugin_obj)
|
self._pluggy_pm.register(plugin_obj)
|
||||||
else:
|
else:
|
||||||
LOG.warning(
|
LOG.warning(
|
||||||
f'Plugin {plugin_obj.__class__.__name__} is disabled'
|
f"Plugin {plugin_obj.__class__.__name__} is disabled"
|
||||||
)
|
)
|
||||||
elif isinstance(plugin_obj, APRSDPluginBase):
|
elif isinstance(plugin_obj, APRSDPluginBase):
|
||||||
if plugin_obj.enabled:
|
if plugin_obj.enabled:
|
||||||
@ -472,7 +471,7 @@ class PluginManager:
|
|||||||
self._pluggy_pm.register(plugin_obj)
|
self._pluggy_pm.register(plugin_obj)
|
||||||
else:
|
else:
|
||||||
LOG.warning(
|
LOG.warning(
|
||||||
f'Plugin {plugin_obj.__class__.__name__} is disabled'
|
f"Plugin {plugin_obj.__class__.__name__} is disabled"
|
||||||
)
|
)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f"Couldn't load plugin '{plugin_name}'")
|
LOG.error(f"Couldn't load plugin '{plugin_name}'")
|
||||||
@ -486,11 +485,11 @@ class PluginManager:
|
|||||||
def setup_plugins(
|
def setup_plugins(
|
||||||
self,
|
self,
|
||||||
load_help_plugin=True,
|
load_help_plugin=True,
|
||||||
plugin_list=None,
|
plugin_list=[],
|
||||||
):
|
):
|
||||||
"""Create the plugin manager and register plugins."""
|
"""Create the plugin manager and register plugins."""
|
||||||
|
|
||||||
LOG.info('Loading APRSD Plugins')
|
LOG.info("Loading APRSD Plugins")
|
||||||
# Help plugin is always enabled.
|
# Help plugin is always enabled.
|
||||||
if load_help_plugin:
|
if load_help_plugin:
|
||||||
_help = HelpPlugin()
|
_help = HelpPlugin()
|
||||||
@ -510,7 +509,7 @@ class PluginManager:
|
|||||||
for p_name in CORE_MESSAGE_PLUGINS:
|
for p_name in CORE_MESSAGE_PLUGINS:
|
||||||
self._load_plugin(p_name)
|
self._load_plugin(p_name)
|
||||||
|
|
||||||
LOG.info('Completed Plugin Loading.')
|
LOG.info("Completed Plugin Loading.")
|
||||||
|
|
||||||
def run(self, packet: packets.MessagePacket):
|
def run(self, packet: packets.MessagePacket):
|
||||||
"""Execute all the plugins run method."""
|
"""Execute all the plugins run method."""
|
||||||
@ -525,7 +524,7 @@ class PluginManager:
|
|||||||
"""Stop all threads created by all plugins."""
|
"""Stop all threads created by all plugins."""
|
||||||
with self.lock:
|
with self.lock:
|
||||||
for p in self.get_plugins():
|
for p in self.get_plugins():
|
||||||
if hasattr(p, 'stop_threads'):
|
if hasattr(p, "stop_threads"):
|
||||||
p.stop_threads()
|
p.stop_threads()
|
||||||
|
|
||||||
def register_msg(self, obj):
|
def register_msg(self, obj):
|
||||||
|
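For orientation between the two file listings: a minimal sketch of a command plugin built on the APRSDRegexCommandPluginBase API shown in the plugin.py hunks above (command_regex, command_name, setup() and process(), with a returned string becoming the reply). The class name, regex and reply text here are illustrative only and exist in neither branch.

```python
import logging

from aprsd import packets, plugin

LOG = logging.getLogger('APRSD')


class PingPlugin(plugin.APRSDRegexCommandPluginBase):
    """Illustrative plugin: reply 'pong' to any message starting with 'ping'."""

    command_regex = r'^[pP]ing'
    command_name = 'ping'
    short_description = 'Reply pong to a ping message'

    def setup(self):
        # No external API key is needed, so the plugin is always enabled.
        self.enabled = True

    def process(self, packet: packets.MessagePacket):
        LOG.info(f'PingPlugin called by {packet.from_call}')
        # A returned string is sent back to the caller as a MessagePacket.
        return 'pong'
```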
aprsd/plugin_utils.py
@@ -4,20 +4,21 @@ import logging

 import requests

-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 def get_aprs_fi(api_key, callsign):
     LOG.debug(f"Fetch aprs.fi location for '{callsign}'")
     try:
         url = (
-            'http://api.aprs.fi/api/get?&what=loc&apikey={}&format=json&name={}'.format(
-                api_key, callsign
-            )
+            "http://api.aprs.fi/api/get?"
+            "&what=loc&apikey={}&format=json"
+            "&name={}".format(api_key, callsign)
         )
         response = requests.get(url)
-    except Exception as e:
-        raise Exception('Failed to get aprs.fi location') from e
+    except Exception:
+        raise Exception("Failed to get aprs.fi location")
     else:
         response.raise_for_status()
         return json.loads(response.text)
@@ -25,22 +26,22 @@ def get_aprs_fi(api_key, callsign):

 def get_weather_gov_for_gps(lat, lon):
     # FIXME(hemna) This is currently BROKEN
-    LOG.debug(f'Fetch station at {lat}, {lon}')
+    LOG.debug(f"Fetch station at {lat}, {lon}")
     headers = requests.utils.default_headers()
     headers.update(
-        {'User-Agent': '(aprsd, waboring@hemna.com)'},
+        {"User-Agent": "(aprsd, waboring@hemna.com)"},
     )
     try:
         url2 = (
-            'https://forecast.weather.gov/MapClick.php?lat=%s'
-            '&lon=%s&FcstType=json' % (lat, lon)
+            "https://forecast.weather.gov/MapClick.php?lat=%s"
+            "&lon=%s&FcstType=json" % (lat, lon)
             # f"https://api.weather.gov/points/{lat},{lon}"
         )
         LOG.debug(f"Fetching weather '{url2}'")
         response = requests.get(url2, headers=headers)
     except Exception as e:
         LOG.error(e)
-        raise Exception('Failed to get weather') from e
+        raise Exception("Failed to get weather")
     else:
         response.raise_for_status()
         return json.loads(response.text)
@@ -49,24 +50,24 @@ def get_weather_gov_for_gps(lat, lon):

 def get_weather_gov_metar(station):
     LOG.debug(f"Fetch metar for station '{station}'")
     try:
-        url = 'https://api.weather.gov/stations/{}/observations/latest'.format(
+        url = "https://api.weather.gov/stations/{}/observations/latest".format(
             station,
         )
         response = requests.get(url)
-    except Exception as e:
-        raise Exception('Failed to fetch metar') from e
+    except Exception:
+        raise Exception("Failed to fetch metar")
     else:
         response.raise_for_status()
         return json.loads(response)


-def fetch_openweathermap(api_key, lat, lon, units='metric', exclude=None):
-    LOG.debug(f'Fetch openweathermap for {lat}, {lon}')
+def fetch_openweathermap(api_key, lat, lon, units="metric", exclude=None):
+    LOG.debug(f"Fetch openweathermap for {lat}, {lon}")
     if not exclude:
-        exclude = 'minutely,hourly,daily,alerts'
+        exclude = "minutely,hourly,daily,alerts"
     try:
         url = (
-            "https://api.openweathermap.org/data/3.0/onecall?"
+            "https://api.openweathermap.org/data/2.5/onecall?"
             "lat={}&lon={}&appid={}&units={}&exclude={}".format(
                 lat,
                 lon,
@@ -79,7 +80,7 @@ def fetch_openweathermap(api_key, lat, lon, units='metric', exclude=None):
         response = requests.get(url)
     except Exception as e:
         LOG.error(e)
-        raise Exception('Failed to get weather') from e
+        raise Exception("Failed to get weather")
     else:
         response.raise_for_status()
         return json.loads(response.text)
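A rough usage sketch of the helpers above, mirroring how the weather plugins in the next file call them: resolve a callsign to a position through aprs.fi, then feed the coordinates to a weather lookup. The API keys and callsign are placeholders and error handling is omitted.

```python
from aprsd import plugin_utils

APRS_FI_KEY = 'your-aprs-fi-api-key'
OWM_KEY = 'your-openweathermap-api-key'

# aprs.fi returns a dict whose 'entries' list holds lat/lng values.
aprs_data = plugin_utils.get_aprs_fi(APRS_FI_KEY, 'N0CALL')
entry = aprs_data['entries'][0]
lat, lon = entry['lat'], entry['lng']

# Either weather backend can then be queried with the coordinates.
wx = plugin_utils.fetch_openweathermap(OWM_KEY, lat, lon, exclude='minutely,hourly')
print(wx['current']['temp'])
```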
@ -9,7 +9,7 @@ from aprsd import plugin, plugin_utils
|
|||||||
from aprsd.utils import trace
|
from aprsd.utils import trace
|
||||||
|
|
||||||
CONF = cfg.CONF
|
CONF = cfg.CONF
|
||||||
LOG = logging.getLogger('APRSD')
|
LOG = logging.getLogger("APRSD")
|
||||||
|
|
||||||
|
|
||||||
class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
|
class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
|
||||||
@ -26,22 +26,22 @@ class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin)
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
# command_regex = r"^([w][x]|[w][x]\s|weather)"
|
# command_regex = r"^([w][x]|[w][x]\s|weather)"
|
||||||
command_regex = r'^[wW]'
|
command_regex = r"^[wW]"
|
||||||
|
|
||||||
command_name = 'USWeather'
|
command_name = "USWeather"
|
||||||
short_description = 'Provide USA only weather of GPS Beacon location'
|
short_description = "Provide USA only weather of GPS Beacon location"
|
||||||
|
|
||||||
def setup(self):
|
def setup(self):
|
||||||
self.ensure_aprs_fi_key()
|
self.ensure_aprs_fi_key()
|
||||||
|
|
||||||
@trace.trace
|
@trace.trace
|
||||||
def process(self, packet):
|
def process(self, packet):
|
||||||
LOG.info('Weather Plugin')
|
LOG.info("Weather Plugin")
|
||||||
fromcall = packet.from_call
|
fromcall = packet.from_call
|
||||||
message = packet.get('message_text', None)
|
message = packet.get("message_text", None)
|
||||||
# message = packet.get("message_text", None)
|
# message = packet.get("message_text", None)
|
||||||
# ack = packet.get("msgNo", "0")
|
# ack = packet.get("msgNo", "0")
|
||||||
a = re.search(r'^.*\s+(.*)', message)
|
a = re.search(r"^.*\s+(.*)", message)
|
||||||
if a is not None:
|
if a is not None:
|
||||||
searchcall = a.group(1)
|
searchcall = a.group(1)
|
||||||
searchcall = searchcall.upper()
|
searchcall = searchcall.upper()
|
||||||
@ -51,34 +51,34 @@ class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin)
|
|||||||
try:
|
try:
|
||||||
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
|
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f'Failed to fetch aprs.fi data {ex}')
|
LOG.error(f"Failed to fetch aprs.fi data {ex}")
|
||||||
return 'Failed to fetch aprs.fi location'
|
return "Failed to fetch aprs.fi location"
|
||||||
|
|
||||||
LOG.debug(f'LocationPlugin: aprs_data = {aprs_data}')
|
LOG.debug(f"LocationPlugin: aprs_data = {aprs_data}")
|
||||||
if not len(aprs_data['entries']):
|
if not len(aprs_data["entries"]):
|
||||||
LOG.error("Didn't get any entries from aprs.fi")
|
LOG.error("Didn't get any entries from aprs.fi")
|
||||||
return 'Failed to fetch aprs.fi location'
|
return "Failed to fetch aprs.fi location"
|
||||||
|
|
||||||
lat = aprs_data['entries'][0]['lat']
|
lat = aprs_data["entries"][0]["lat"]
|
||||||
lon = aprs_data['entries'][0]['lng']
|
lon = aprs_data["entries"][0]["lng"]
|
||||||
|
|
||||||
try:
|
try:
|
||||||
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
|
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
|
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
|
||||||
return 'Unable to get weather'
|
return "Unable to get weather"
|
||||||
|
|
||||||
LOG.info(f'WX data {wx_data}')
|
LOG.info(f"WX data {wx_data}")
|
||||||
|
|
||||||
reply = (
|
reply = (
|
||||||
'%sF(%sF/%sF) %s. %s, %s.'
|
"%sF(%sF/%sF) %s. %s, %s."
|
||||||
% (
|
% (
|
||||||
wx_data['currentobservation']['Temp'],
|
wx_data["currentobservation"]["Temp"],
|
||||||
wx_data['data']['temperature'][0],
|
wx_data["data"]["temperature"][0],
|
||||||
wx_data['data']['temperature'][1],
|
wx_data["data"]["temperature"][1],
|
||||||
wx_data['data']['weather'][0],
|
wx_data["data"]["weather"][0],
|
||||||
wx_data['time']['startPeriodName'][1],
|
wx_data["time"]["startPeriodName"][1],
|
||||||
wx_data['data']['weather'][1],
|
wx_data["data"]["weather"][1],
|
||||||
)
|
)
|
||||||
).rstrip()
|
).rstrip()
|
||||||
LOG.debug(f"reply: '{reply}' ")
|
LOG.debug(f"reply: '{reply}' ")
|
||||||
@ -100,31 +100,31 @@ class USMetarPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
|
|||||||
|
|
||||||
"""
|
"""
|
||||||
|
|
||||||
command_regex = r'^([m]|[M]|[m]\s|metar)'
|
command_regex = r"^([m]|[M]|[m]\s|metar)"
|
||||||
command_name = 'USMetar'
|
command_name = "USMetar"
|
||||||
short_description = 'USA only METAR of GPS Beacon location'
|
short_description = "USA only METAR of GPS Beacon location"
|
||||||
|
|
||||||
def setup(self):
|
def setup(self):
|
||||||
self.ensure_aprs_fi_key()
|
self.ensure_aprs_fi_key()
|
||||||
|
|
||||||
@trace.trace
|
@trace.trace
|
||||||
def process(self, packet):
|
def process(self, packet):
|
||||||
fromcall = packet.get('from')
|
fromcall = packet.get("from")
|
||||||
message = packet.get('message_text', None)
|
message = packet.get("message_text", None)
|
||||||
# ack = packet.get("msgNo", "0")
|
# ack = packet.get("msgNo", "0")
|
||||||
LOG.info(f"WX Plugin '{message}'")
|
LOG.info(f"WX Plugin '{message}'")
|
||||||
a = re.search(r'^.*\s+(.*)', message)
|
a = re.search(r"^.*\s+(.*)", message)
|
||||||
if a is not None:
|
if a is not None:
|
||||||
searchcall = a.group(1)
|
searchcall = a.group(1)
|
||||||
station = searchcall.upper()
|
station = searchcall.upper()
|
||||||
try:
|
try:
|
||||||
resp = plugin_utils.get_weather_gov_metar(station)
|
resp = plugin_utils.get_weather_gov_metar(station)
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
LOG.debug(f'Weather failed with: {str(e)}')
|
LOG.debug(f"Weather failed with: {str(e)}")
|
||||||
reply = 'Unable to find station METAR'
|
reply = "Unable to find station METAR"
|
||||||
else:
|
else:
|
||||||
station_data = json.loads(resp.text)
|
station_data = json.loads(resp.text)
|
||||||
reply = station_data['properties']['rawMessage']
|
reply = station_data["properties"]["rawMessage"]
|
||||||
|
|
||||||
return reply
|
return reply
|
||||||
else:
|
else:
|
||||||
@ -136,36 +136,36 @@ class USMetarPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
|
|||||||
try:
|
try:
|
||||||
aprs_data = plugin_utils.get_aprs_fi(api_key, fromcall)
|
aprs_data = plugin_utils.get_aprs_fi(api_key, fromcall)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f'Failed to fetch aprs.fi data {ex}')
|
LOG.error(f"Failed to fetch aprs.fi data {ex}")
|
||||||
return 'Failed to fetch aprs.fi location'
|
return "Failed to fetch aprs.fi location"
|
||||||
|
|
||||||
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
|
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
|
||||||
if not len(aprs_data['entries']):
|
if not len(aprs_data["entries"]):
|
||||||
LOG.error('Found no entries from aprs.fi!')
|
LOG.error("Found no entries from aprs.fi!")
|
||||||
return 'Failed to fetch aprs.fi location'
|
return "Failed to fetch aprs.fi location"
|
||||||
|
|
||||||
lat = aprs_data['entries'][0]['lat']
|
lat = aprs_data["entries"][0]["lat"]
|
||||||
lon = aprs_data['entries'][0]['lng']
|
lon = aprs_data["entries"][0]["lng"]
|
||||||
|
|
||||||
try:
|
try:
|
||||||
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
|
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
|
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
|
||||||
return 'Unable to metar find station.'
|
return "Unable to metar find station."
|
||||||
|
|
||||||
if wx_data['location']['metar']:
|
if wx_data["location"]["metar"]:
|
||||||
station = wx_data['location']['metar']
|
station = wx_data["location"]["metar"]
|
||||||
try:
|
try:
|
||||||
resp = plugin_utils.get_weather_gov_metar(station)
|
resp = plugin_utils.get_weather_gov_metar(station)
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
LOG.debug(f'Weather failed with: {str(e)}')
|
LOG.debug(f"Weather failed with: {str(e)}")
|
||||||
reply = 'Failed to get Metar'
|
reply = "Failed to get Metar"
|
||||||
else:
|
else:
|
||||||
station_data = json.loads(resp.text)
|
station_data = json.loads(resp.text)
|
||||||
reply = station_data['properties']['rawMessage']
|
reply = station_data["properties"]["rawMessage"]
|
||||||
else:
|
else:
|
||||||
# Couldn't find a station
|
# Couldn't find a station
|
||||||
reply = 'No Metar station found'
|
reply = "No Metar station found"
|
||||||
|
|
||||||
return reply
|
return reply
|
||||||
|
|
||||||
@ -190,36 +190,35 @@ class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
# command_regex = r"^([w][x]|[w][x]\s|weather)"
|
# command_regex = r"^([w][x]|[w][x]\s|weather)"
|
||||||
command_regex = r'^[wW]'
|
command_regex = r"^[wW]"
|
||||||
|
|
||||||
command_name = 'OpenWeatherMap'
|
command_name = "OpenWeatherMap"
|
||||||
short_description = 'OpenWeatherMap weather of GPS Beacon location'
|
short_description = "OpenWeatherMap weather of GPS Beacon location"
|
||||||
|
|
||||||
def setup(self):
|
def setup(self):
|
||||||
if not CONF.owm_weather_plugin.apiKey:
|
if not CONF.owm_weather_plugin.apiKey:
|
||||||
LOG.error('Config.owm_weather_plugin.apiKey is not set. Disabling')
|
LOG.error("Config.owm_weather_plugin.apiKey is not set. Disabling")
|
||||||
self.enabled = False
|
self.enabled = False
|
||||||
else:
|
else:
|
||||||
self.enabled = True
|
self.enabled = True
|
||||||
|
|
||||||
def help(self):
|
def help(self):
|
||||||
_help = [
|
_help = [
|
||||||
'openweathermap: Send {} to get weather from your location'.format(
|
"openweathermap: Send {} to get weather " "from your location".format(
|
||||||
self.command_regex
|
|
||||||
),
|
|
||||||
'openweathermap: Send {} <callsign> to get weather from <callsign>'.format(
|
|
||||||
self.command_regex
|
self.command_regex
|
||||||
),
|
),
|
||||||
|
"openweathermap: Send {} <callsign> to get "
|
||||||
|
"weather from <callsign>".format(self.command_regex),
|
||||||
]
|
]
|
||||||
return _help
|
return _help
|
||||||
|
|
||||||
@trace.trace
|
@trace.trace
|
||||||
def process(self, packet):
|
def process(self, packet):
|
||||||
fromcall = packet.get('from_call')
|
fromcall = packet.get("from_call")
|
||||||
message = packet.get('message_text', None)
|
message = packet.get("message_text", None)
|
||||||
# ack = packet.get("msgNo", "0")
|
# ack = packet.get("msgNo", "0")
|
||||||
LOG.info(f"OWMWeather Plugin '{message}'")
|
LOG.info(f"OWMWeather Plugin '{message}'")
|
||||||
a = re.search(r'^.*\s+(.*)', message)
|
a = re.search(r"^.*\s+(.*)", message)
|
||||||
if a is not None:
|
if a is not None:
|
||||||
searchcall = a.group(1)
|
searchcall = a.group(1)
|
||||||
searchcall = searchcall.upper()
|
searchcall = searchcall.upper()
|
||||||
@ -231,16 +230,16 @@ class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
try:
|
try:
|
||||||
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
|
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f'Failed to fetch aprs.fi data {ex}')
|
LOG.error(f"Failed to fetch aprs.fi data {ex}")
|
||||||
return 'Failed to fetch location'
|
return "Failed to fetch location"
|
||||||
|
|
||||||
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
|
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
|
||||||
if not len(aprs_data['entries']):
|
if not len(aprs_data["entries"]):
|
||||||
LOG.error('Found no entries from aprs.fi!')
|
LOG.error("Found no entries from aprs.fi!")
|
||||||
return 'Failed to fetch location'
|
return "Failed to fetch location"
|
||||||
|
|
||||||
lat = aprs_data['entries'][0]['lat']
|
lat = aprs_data["entries"][0]["lat"]
|
||||||
lon = aprs_data['entries'][0]['lng']
|
lon = aprs_data["entries"][0]["lng"]
|
||||||
|
|
||||||
units = CONF.units
|
units = CONF.units
|
||||||
api_key = CONF.owm_weather_plugin.apiKey
|
api_key = CONF.owm_weather_plugin.apiKey
|
||||||
@ -250,40 +249,40 @@ class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
lat,
|
lat,
|
||||||
lon,
|
lon,
|
||||||
units=units,
|
units=units,
|
||||||
exclude='minutely,hourly',
|
exclude="minutely,hourly",
|
||||||
)
|
)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f"Couldn't fetch openweathermap api '{ex}'")
|
LOG.error(f"Couldn't fetch openweathermap api '{ex}'")
|
||||||
# default to UTC
|
# default to UTC
|
||||||
return 'Unable to get weather'
|
return "Unable to get weather"
|
||||||
|
|
||||||
if units == 'metric':
|
if units == "metric":
|
||||||
degree = 'C'
|
degree = "C"
|
||||||
else:
|
else:
|
||||||
degree = 'F'
|
degree = "F"
|
||||||
|
|
||||||
if 'wind_gust' in wx_data['current']:
|
if "wind_gust" in wx_data["current"]:
|
||||||
wind = '{:.0f}@{}G{:.0f}'.format(
|
wind = "{:.0f}@{}G{:.0f}".format(
|
||||||
wx_data['current']['wind_speed'],
|
wx_data["current"]["wind_speed"],
|
||||||
wx_data['current']['wind_deg'],
|
wx_data["current"]["wind_deg"],
|
||||||
wx_data['current']['wind_gust'],
|
wx_data["current"]["wind_gust"],
|
||||||
)
|
)
|
||||||
else:
|
else:
|
||||||
wind = '{:.0f}@{}'.format(
|
wind = "{:.0f}@{}".format(
|
||||||
wx_data['current']['wind_speed'],
|
wx_data["current"]["wind_speed"],
|
||||||
wx_data['current']['wind_deg'],
|
wx_data["current"]["wind_deg"],
|
||||||
)
|
)
|
||||||
|
|
||||||
# LOG.debug(wx_data["current"])
|
# LOG.debug(wx_data["current"])
|
||||||
# LOG.debug(wx_data["daily"])
|
# LOG.debug(wx_data["daily"])
|
||||||
reply = '{} {:.1f}{}/{:.1f}{} Wind {} {}%'.format(
|
reply = "{} {:.1f}{}/{:.1f}{} Wind {} {}%".format(
|
||||||
wx_data['current']['weather'][0]['description'],
|
wx_data["current"]["weather"][0]["description"],
|
||||||
wx_data['current']['temp'],
|
wx_data["current"]["temp"],
|
||||||
degree,
|
degree,
|
||||||
wx_data['current']['dew_point'],
|
wx_data["current"]["dew_point"],
|
||||||
degree,
|
degree,
|
||||||
wind,
|
wind,
|
||||||
wx_data['current']['humidity'],
|
wx_data["current"]["humidity"],
|
||||||
)
|
)
|
||||||
|
|
||||||
return reply
|
return reply
|
||||||
@ -312,26 +311,26 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
docker build -f Dockerfile -t avwx-api:master .
|
docker build -f Dockerfile -t avwx-api:master .
|
||||||
"""
|
"""
|
||||||
|
|
||||||
command_regex = r'^([m]|[m]|[m]\s|metar)'
|
command_regex = r"^([m]|[m]|[m]\s|metar)"
|
||||||
command_name = 'AVWXWeather'
|
command_name = "AVWXWeather"
|
||||||
short_description = 'AVWX weather of GPS Beacon location'
|
short_description = "AVWX weather of GPS Beacon location"
|
||||||
|
|
||||||
def setup(self):
|
def setup(self):
|
||||||
if not CONF.avwx_plugin.base_url:
|
if not CONF.avwx_plugin.base_url:
|
||||||
LOG.error('Config avwx_plugin.base_url not specified. Disabling')
|
LOG.error("Config avwx_plugin.base_url not specified. Disabling")
|
||||||
return False
|
return False
|
||||||
elif not CONF.avwx_plugin.apiKey:
|
elif not CONF.avwx_plugin.apiKey:
|
||||||
LOG.error('Config avwx_plugin.apiKey not specified. Disabling')
|
LOG.error("Config avwx_plugin.apiKey not specified. Disabling")
|
||||||
return False
|
return False
|
||||||
|
else:
|
||||||
self.enabled = True
|
return True
|
||||||
|
|
||||||
def help(self):
|
def help(self):
|
||||||
_help = [
|
_help = [
|
||||||
'avwxweather: Send {} to get weather from your location'.format(
|
"avwxweather: Send {} to get weather " "from your location".format(
|
||||||
self.command_regex
|
self.command_regex
|
||||||
),
|
),
|
||||||
'avwxweather: Send {} <callsign> to get weather from <callsign>'.format(
|
"avwxweather: Send {} <callsign> to get " "weather from <callsign>".format(
|
||||||
self.command_regex
|
self.command_regex
|
||||||
),
|
),
|
||||||
]
|
]
|
||||||
@ -339,11 +338,11 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
|
|
||||||
@trace.trace
|
@trace.trace
|
||||||
def process(self, packet):
|
def process(self, packet):
|
||||||
fromcall = packet.get('from')
|
fromcall = packet.get("from")
|
||||||
message = packet.get('message_text', None)
|
message = packet.get("message_text", None)
|
||||||
# ack = packet.get("msgNo", "0")
|
# ack = packet.get("msgNo", "0")
|
||||||
LOG.info(f"AVWXWeather Plugin '{message}'")
|
LOG.info(f"AVWXWeather Plugin '{message}'")
|
||||||
a = re.search(r'^.*\s+(.*)', message)
|
a = re.search(r"^.*\s+(.*)", message)
|
||||||
if a is not None:
|
if a is not None:
|
||||||
searchcall = a.group(1)
|
searchcall = a.group(1)
|
||||||
searchcall = searchcall.upper()
|
searchcall = searchcall.upper()
|
||||||
@ -354,43 +353,43 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
try:
|
try:
|
||||||
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
|
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(f'Failed to fetch aprs.fi data {ex}')
|
LOG.error(f"Failed to fetch aprs.fi data {ex}")
|
||||||
return 'Failed to fetch location'
|
return "Failed to fetch location"
|
||||||
|
|
||||||
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
|
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
|
||||||
if not len(aprs_data['entries']):
|
if not len(aprs_data["entries"]):
|
||||||
LOG.error('Found no entries from aprs.fi!')
|
LOG.error("Found no entries from aprs.fi!")
|
||||||
return 'Failed to fetch location'
|
return "Failed to fetch location"
|
||||||
|
|
||||||
lat = aprs_data['entries'][0]['lat']
|
lat = aprs_data["entries"][0]["lat"]
|
||||||
lon = aprs_data['entries'][0]['lng']
|
lon = aprs_data["entries"][0]["lng"]
|
||||||
|
|
||||||
api_key = CONF.avwx_plugin.apiKey
|
api_key = CONF.avwx_plugin.apiKey
|
||||||
base_url = CONF.avwx_plugin.base_url
|
base_url = CONF.avwx_plugin.base_url
|
||||||
token = f'TOKEN {api_key}'
|
token = f"TOKEN {api_key}"
|
||||||
headers = {'Authorization': token}
|
headers = {"Authorization": token}
|
||||||
try:
|
try:
|
||||||
coord = f'{lat},{lon}'
|
coord = f"{lat},{lon}"
|
||||||
url = (
|
url = (
|
||||||
'{}/api/station/near/{}?'
|
"{}/api/station/near/{}?"
|
||||||
'n=1&airport=false&reporting=true&format=json'.format(base_url, coord)
|
"n=1&airport=false&reporting=true&format=json".format(base_url, coord)
|
||||||
)
|
)
|
||||||
|
|
||||||
LOG.debug(f"Get stations near me '{url}'")
|
LOG.debug(f"Get stations near me '{url}'")
|
||||||
response = requests.get(url, headers=headers)
|
response = requests.get(url, headers=headers)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(ex)
|
LOG.error(ex)
|
||||||
raise Exception(f"Failed to get the weather '{ex}'") from ex
|
raise Exception(f"Failed to get the weather '{ex}'")
|
||||||
else:
|
else:
|
||||||
wx_data = json.loads(response.text)
|
wx_data = json.loads(response.text)
|
||||||
|
|
||||||
# LOG.debug(wx_data)
|
# LOG.debug(wx_data)
|
||||||
station = wx_data[0]['station']['icao']
|
station = wx_data[0]["station"]["icao"]
|
||||||
|
|
||||||
try:
|
try:
|
||||||
url = (
|
url = (
|
||||||
'{}/api/metar/{}?options=info,translate,summary'
|
"{}/api/metar/{}?options=info,translate,summary"
|
||||||
'&airport=true&reporting=true&format=json&onfail=cache'.format(
|
"&airport=true&reporting=true&format=json&onfail=cache".format(
|
||||||
base_url,
|
base_url,
|
||||||
station,
|
station,
|
||||||
)
|
)
|
||||||
@ -400,9 +399,9 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
|
|||||||
response = requests.get(url, headers=headers)
|
response = requests.get(url, headers=headers)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
LOG.error(ex)
|
LOG.error(ex)
|
||||||
raise Exception(f'Failed to get metar {ex}') from ex
|
raise Exception(f"Failed to get metar {ex}")
|
||||||
else:
|
else:
|
||||||
metar_data = json.loads(response.text)
|
metar_data = json.loads(response.text)
|
||||||
|
|
||||||
# LOG.debug(metar_data)
|
# LOG.debug(metar_data)
|
||||||
return metar_data['raw']
|
return metar_data["raw"]
|
||||||
|
aprsd/stats/collector.py
@@ -3,7 +3,7 @@ from typing import Callable, Protocol, runtime_checkable

 from aprsd.utils import singleton

-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 @runtime_checkable
@@ -31,16 +31,15 @@ class Collector:
                     serializable=serializable
                 ).copy()
             except Exception as e:
-                LOG.error(f'Error in producer {name} (stats): {e}')
-                raise e
+                LOG.error(f"Error in producer {name} (stats): {e}")
         return stats

     def register_producer(self, producer_name: Callable):
         if not isinstance(producer_name, StatsProducer):
-            raise TypeError(f'Producer {producer_name} is not a StatsProducer')
+            raise TypeError(f"Producer {producer_name} is not a StatsProducer")
         self.producers.append(producer_name)

     def unregister_producer(self, producer_name: Callable):
         if not isinstance(producer_name, StatsProducer):
-            raise TypeError(f'Producer {producer_name} is not a StatsProducer')
+            raise TypeError(f"Producer {producer_name} is not a StatsProducer")
         self.producers.remove(producer_name)
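The Collector above accepts anything that satisfies the StatsProducer protocol. Below is a sketch of an extension-side producer; since these hunks do not show how collect() instantiates producers, the class itself is registered here (matching the Callable type hint — adjust if your aprsd version expects an instance), and the class and field names are made up.

```python
from aprsd.stats import collector


class MyExtensionStats:
    """Illustrative stats producer for an extension."""

    packets_seen = 0

    def stats(self, serializable=False) -> dict:
        # Collector.collect() copies whatever dict this returns.
        return {'packets_seen': MyExtensionStats.packets_seen}


# register_producer() type-checks against the StatsProducer protocol.
collector.Collector().register_producer(MyExtensionStats)
```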
aprsd/threads/__init__.py
@@ -4,8 +4,9 @@ import queue
 # aprsd.threads
 from .aprsd import APRSDThread, APRSDThreadList  # noqa: F401
 from .rx import (  # noqa: F401
+    APRSDDupeRXThread,
     APRSDProcessPacketThread,
     APRSDRXThread,
 )

-packet_queue = queue.Queue(maxsize=500)
+packet_queue = queue.Queue(maxsize=20)
aprsd/threads/rx.py
@@ -7,50 +7,36 @@ import aprslib
 from oslo_config import cfg

 from aprsd import packets, plugin
-from aprsd.client.client import APRSDClient
-from aprsd.packets import collector, filter
+from aprsd.client import client_factory
+from aprsd.packets import collector
 from aprsd.packets import log as packet_log
 from aprsd.threads import APRSDThread, tx
+from aprsd.utils import trace

 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 class APRSDRXThread(APRSDThread):
-    """Main Class to connect to an APRS Client and recieve packets.
-
-    A packet is received in the main loop and then sent to the
-    process_packet method, which sends the packet through the collector
-    to track the packet for stats, and then put into the packet queue
-    for processing in a separate thread.
-    """
-
     _client = None

-    # This is the queue that packets are sent to for processing.
-    # We process packets in a separate thread to help prevent
-    # getting blocked by the APRS server trying to send us packets.
-    packet_queue = None
-
-    pkt_count = 0
-
     def __init__(self, packet_queue):
-        super().__init__('RX_PKT')
+        super().__init__("RX_PKT")
         self.packet_queue = packet_queue

     def stop(self):
         self.thread_stop = True
         if self._client:
-            self._client.close()
+            self._client.stop()

     def loop(self):
         if not self._client:
-            self._client = APRSDClient()
+            self._client = client_factory.create()
             time.sleep(1)
             return True

-        if not self._client.is_alive:
-            self._client = APRSDClient()
+        if not self._client.is_connected:
+            self._client = client_factory.create()
             time.sleep(1)
             return True

@@ -66,35 +52,62 @@ class APRSDRXThread(APRSDThread):
             # kwargs. :(
             # https://github.com/rossengeorgiev/aprs-python/pull/56
             self._client.consumer(
-                self.process_packet,
+                self._process_packet,
                 raw=False,
+                blocking=False,
             )
         except (
             aprslib.exceptions.ConnectionDrop,
             aprslib.exceptions.ConnectionError,
         ):
-            LOG.error('Connection dropped, reconnecting')
+            LOG.error("Connection dropped, reconnecting")
             # Force the deletion of the client object connected to aprs
             # This will cause a reconnect, next time client.get_client()
             # is called
             self._client.reset()
             time.sleep(5)
-        except Exception as ex:
-            LOG.exception(ex)
-            LOG.error('Resetting connection and trying again.')
+        except Exception:
+            # LOG.exception(ex)
+            LOG.error("Resetting connection and trying again.")
             self._client.reset()
             time.sleep(5)
+        # Continue to loop
+        time.sleep(1)
         return True

+    def _process_packet(self, *args, **kwargs):
+        """Intermediate callback so we can update the keepalive time."""
+        # Now call the 'real' packet processing for a RX'x packet
+        self.process_packet(*args, **kwargs)
+
+    @abc.abstractmethod
     def process_packet(self, *args, **kwargs):
+        pass
+
+
+class APRSDDupeRXThread(APRSDRXThread):
+    """Process received packets.
+
+    This is the main APRSD Server command thread that
+    receives packets and makes sure the packet
+    hasn't been seen previously before sending it on
+    to be processed.
+    """
+
+    @trace.trace
+    def process_packet(self, *args, **kwargs):
+        """This handles the processing of an inbound packet.
+
+        When a packet is received by the connected client object,
+        it sends the raw packet into this function. This function then
+        decodes the packet via the client, and then processes the packet.
+        Ack Packets are sent to the PluginProcessPacketThread for processing.
+        All other packets have to be checked as a dupe, and then only after
+        we haven't seen this packet before, do we send it to the
+        PluginProcessPacketThread for processing.
+        """
         packet = self._client.decode_packet(*args, **kwargs)
-        if not packet:
-            LOG.error(
-                'No packet received from decode_packet. Most likely a failure to parse'
-            )
-            return
-        self.pkt_count += 1
-        packet_log.log(packet, packet_count=self.pkt_count)
+        packet_log.log(packet)
         pkt_list = packets.PacketList()

         if isinstance(packet, packets.AckPacket):
@@ -127,55 +140,26 @@ class APRSDRXThread(APRSDThread):
                 # If the packet came in within N seconds of the
                 # Last time seeing the packet, then we drop it as a dupe.
                 LOG.warning(
-                    f'Packet {packet.from_call}:{packet.msgNo} already tracked, dropping.'
+                    f"Packet {packet.from_call}:{packet.msgNo} already tracked, dropping."
                 )
             else:
                 LOG.warning(
-                    f'Packet {packet.from_call}:{packet.msgNo} already tracked '
-                    f'but older than {CONF.packet_dupe_timeout} seconds. processing.',
+                    f"Packet {packet.from_call}:{packet.msgNo} already tracked "
+                    f"but older than {CONF.packet_dupe_timeout} seconds. processing.",
                 )
                 collector.PacketCollector().rx(packet)
                 self.packet_queue.put(packet)


-class APRSDFilterThread(APRSDThread):
-    def __init__(self, thread_name, packet_queue):
-        super().__init__(thread_name)
-        self.packet_queue = packet_queue
-
-    def filter_packet(self, packet):
-        # Do any packet filtering prior to processing
-        if not filter.PacketFilter().filter(packet):
-            return None
-        return packet
-
-    def print_packet(self, packet):
-        """Allow a child of this class to override this.
-
-        This is helpful if for whatever reason the child class
-        doesn't want to log packets.
-
-        """
-        packet_log.log(packet)
-
-    def loop(self):
-        try:
-            packet = self.packet_queue.get(timeout=1)
-            self.print_packet(packet)
-            if packet:
-                if self.filter_packet(packet):
-                    self.process_packet(packet)
-        except queue.Empty:
-            pass
-        return True
-
-
-class APRSDProcessPacketThread(APRSDFilterThread):
-    """Base class for processing received packets after they have been filtered.
+class APRSDPluginRXThread(APRSDDupeRXThread):
+    """ "Process received packets.
+
+    For backwards compatibility, we keep the APRSDPluginRXThread.
+    """
+
+
+class APRSDProcessPacketThread(APRSDThread):
+    """Base class for processing received packets.

-    Packets are received from the client, then filtered for dupes,
-    then sent to the packet queue. This thread pulls packets from
-    the packet queue for processing.
-
     This is the base class for processing packets coming from
     the consumer. This base class handles sending ack packets and
@@ -183,42 +167,48 @@ class APRSDProcessPacketThread(APRSDFilterThread):
     for processing."""

     def __init__(self, packet_queue):
-        super().__init__('ProcessPKT', packet_queue=packet_queue)
+        self.packet_queue = packet_queue
+        super().__init__("ProcessPKT")
         if not CONF.enable_sending_ack_packets:
             LOG.warning(
-                'Sending ack packets is disabled, messages will not be acknowledged.',
+                "Sending ack packets is disabled, messages "
+                "will not be acknowledged.",
             )

     def process_ack_packet(self, packet):
         """We got an ack for a message, no need to resend it."""
         ack_num = packet.msgNo
-        LOG.debug(f'Got ack for message {ack_num}')
+        LOG.debug(f"Got ack for message {ack_num}")
         collector.PacketCollector().rx(packet)

     def process_piggyback_ack(self, packet):
         """We got an ack embedded in a packet."""
         ack_num = packet.ackMsgNo
-        LOG.debug(f'Got PiggyBackAck for message {ack_num}')
+        LOG.debug(f"Got PiggyBackAck for message {ack_num}")
         collector.PacketCollector().rx(packet)

     def process_reject_packet(self, packet):
         """We got a reject message for a packet. Stop sending the message."""
         ack_num = packet.msgNo
-        LOG.debug(f'Got REJECT for message {ack_num}')
+        LOG.debug(f"Got REJECT for message {ack_num}")
         collector.PacketCollector().rx(packet)

+    def loop(self):
+        try:
+            packet = self.packet_queue.get(timeout=1)
+            if packet:
+                self.process_packet(packet)
+        except queue.Empty:
+            pass
+        return True
+
     def process_packet(self, packet):
         """Process a packet received from aprs-is server."""
-        LOG.debug(f'ProcessPKT-LOOP {self.loop_count}')
+        LOG.debug(f"ProcessPKT-LOOP {self.loop_count}")

-        # set this now as we are going to process it.
-        # This is used during dupe checking, so set it early
-        packet.processed = True
-
         our_call = CONF.callsign.lower()

         from_call = packet.from_call
-        if hasattr(packet, 'addresse') and packet.addresse:
+        if packet.addresse:
             to_call = packet.addresse
         else:
             to_call = packet.to_call
@@ -237,7 +227,7 @@ class APRSDProcessPacketThread(APRSDFilterThread):
         ):
             self.process_reject_packet(packet)
         else:
-            if hasattr(packet, 'ackMsgNo') and packet.ackMsgNo:
+            if hasattr(packet, "ackMsgNo") and packet.ackMsgNo:
                 # we got an ack embedded in this packet
                 # we need to handle the ack
                 self.process_piggyback_ack(packet)
@@ -277,7 +267,7 @@ class APRSDProcessPacketThread(APRSDFilterThread):
             if not for_us:
                 LOG.info("Got a packet meant for someone else '{packet.to_call}'")
         else:
-            LOG.info('Got a non AckPacket/MessagePacket')
+            LOG.info("Got a non AckPacket/MessagePacket")


 class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
@@ -297,7 +287,7 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
                         tx.send(subreply)
                 else:
                     wl = CONF.watch_list
-                    to_call = wl['alert_callsign']
+                    to_call = wl["alert_callsign"]
                     tx.send(
                         packets.MessagePacket(
                             from_call=CONF.callsign,
@@ -309,7 +299,7 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
                 # We have a message based object.
                 tx.send(reply)
         except Exception as ex:
-            LOG.error('Plugin failed!!!')
+            LOG.error("Plugin failed!!!")
             LOG.exception(ex)

     def process_our_message_packet(self, packet):
@@ -365,11 +355,11 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
             if to_call == CONF.callsign and not replied:
                 # Tailor the messages accordingly
                 if CONF.load_help_plugin:
-                    LOG.warning('Sending help!')
+                    LOG.warning("Sending help!")
                     message_text = "Unknown command! Send 'help' message for help"
                 else:
-                    LOG.warning('Unknown command!')
-                    message_text = 'Unknown command!'
+                    LOG.warning("Unknown command!")
+                    message_text = "Unknown command!"

                 tx.send(
                     packets.MessagePacket(
@@ -379,11 +369,11 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
                     ),
                 )
         except Exception as ex:
-            LOG.error('Plugin failed!!!')
+            LOG.error("Plugin failed!!!")
             LOG.exception(ex)
             # Do we need to send a reply?
             if to_call == CONF.callsign:
-                reply = 'A Plugin failed! try again?'
+                reply = "A Plugin failed! try again?"
                 tx.send(
                     packets.MessagePacket(
                         from_call=CONF.callsign,
@@ -392,4 +382,4 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
                     ),
                 )

-        LOG.debug('Completed process_our_message_packet')
+        LOG.debug("Completed process_our_message_packet")
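A rough sketch of how the pieces described in the docstrings above fit together on the 4.0.2 side: one thread consumes packets from the APRS client and enqueues them, a second drains the queue and hands packets to the plugins. This mirrors what the aprsd server command wires up internally; treat it as a sketch of the wiring, not a drop-in program.

```python
from aprsd import threads as aprsd_threads
from aprsd.threads import rx

# Receive side: decode, dupe-check and enqueue packets.
rx_thread = rx.APRSDPluginRXThread(packet_queue=aprsd_threads.packet_queue)
# Processing side: pull from the queue, send acks, run plugins.
process_thread = rx.APRSDPluginProcessPacketThread(
    packet_queue=aprsd_threads.packet_queue,
)

rx_thread.start()
process_thread.start()
rx_thread.join()
process_thread.join()
```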
aprsd/threads/service.py
@@ -1,42 +0,0 @@
-# aprsd/aprsd/threads/service.py
-#
-# This module is used to register threads that the service command runs.
-#
-# The service command is used to start and stop the APRS service.
-# This is a mechanism to register threads that the service or command
-# needs to run, and then start stop them as needed.
-
-from aprsd.threads import aprsd as aprsd_threads
-from aprsd.utils import singleton
-
-
-@singleton
-class ServiceThreads:
-    """Registry for threads that the service command runs.
-
-    This enables extensions to register a thread to run during
-    the service command.
-    """
-
-    def __init__(self):
-        self.threads: list[aprsd_threads.APRSDThread] = []
-
-    def register(self, thread: aprsd_threads.APRSDThread):
-        if not isinstance(thread, aprsd_threads.APRSDThread):
-            raise TypeError(f'Thread {thread} is not an APRSDThread')
-        self.threads.append(thread)
-
-    def unregister(self, thread: aprsd_threads.APRSDThread):
-        if not isinstance(thread, aprsd_threads.APRSDThread):
-            raise TypeError(f'Thread {thread} is not an APRSDThread')
-        self.threads.remove(thread)
-
-    def start(self):
-        """Start all threads in the list."""
-        for thread in self.threads:
-            thread.start()
-
-    def join(self):
-        """Join all the threads in the list"""
-        for thread in self.threads:
-            thread.join()
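The ServiceThreads registry removed above exists only on the master side of this comparison. There, an extension registers any APRSDThread subclass and lets the registry start and join it. A small sketch, with an illustrative thread that does nothing useful:

```python
from aprsd.threads import APRSDThread
from aprsd.threads import service


class HelloThread(APRSDThread):
    """Illustrative thread; aprsd calls loop() until it returns False."""

    def __init__(self):
        super().__init__('Hello')

    def loop(self):
        # Do one unit of work per iteration; returning True keeps it alive.
        return True


service.ServiceThreads().register(HelloThread())
service.ServiceThreads().start()
# ... at shutdown ...
service.ServiceThreads().join()
```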
@@ -1,6 +1,8 @@
 import logging
+import threading
 import time

+import wrapt
 from oslo_config import cfg

 from aprsd.stats import collector
@@ -8,15 +10,18 @@ from aprsd.threads import APRSDThread
 from aprsd.utils import objectstore

 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 class StatsStore(objectstore.ObjectStoreMixin):
     """Container to save the stats from the collector."""

+    lock = threading.Lock()
+    data = {}

+    @wrapt.synchronized(lock)
     def add(self, stats: dict):
-        with self.lock:
-            self.data = stats
+        self.data = stats


 class APRSDStatsStoreThread(APRSDThread):
@@ -26,7 +31,7 @@ class APRSDStatsStoreThread(APRSDThread):
     save_interval = 10

     def __init__(self):
-        super().__init__('StatsStore')
+        super().__init__("StatsStore")

     def loop(self):
         if self.loop_count % self.save_interval == 0:
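On the master side of the `StatsStore` hunk above, `add()` relies on the store's own lock instead of the class-level `wrapt.synchronized` decorator used in 4.0.2. A standard-library-only sketch of that locking style:

```python
import threading


class LockedStore:
    """Minimal stand-in for the master-branch StatsStore locking pattern."""

    def __init__(self):
        self.lock = threading.RLock()  # reentrant, so nested calls stay safe
        self.data = {}

    def add(self, stats: dict):
        # Serialize writers the same way StatsStore.add() does on master.
        with self.lock:
            self.data = stats
```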
@@ -11,12 +11,12 @@ from rush.stores import dictionary

 from aprsd import conf  # noqa
 from aprsd import threads as aprsd_threads
-from aprsd.client.client import APRSDClient
+from aprsd.client import client_factory
 from aprsd.packets import collector, core, tracker
 from aprsd.packets import log as packet_log

 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")

 msg_t = throttle.Throttle(
     limiter=periodic.PeriodicLimiter(
@@ -54,7 +54,7 @@ def send(packet: core.Packet, direct=False, aprs_client=None):
         if CONF.enable_sending_ack_packets:
             _send_ack(packet, direct=direct, aprs_client=aprs_client)
         else:
-            LOG.info('Sending ack packets is disabled. Not sending AckPacket.')
+            LOG.info("Sending ack packets is disabled. Not sending AckPacket.")
     else:
         _send_packet(packet, direct=direct, aprs_client=aprs_client)

@@ -81,14 +81,14 @@ def _send_direct(packet, aprs_client=None):
     if aprs_client:
         cl = aprs_client
     else:
-        cl = APRSDClient()
+        cl = client_factory.create()

     packet.update_timestamp()
     packet_log.log(packet, tx=True)
     try:
         cl.send(packet)
     except Exception as e:
-        LOG.error(f'Failed to send packet: {packet}')
+        LOG.error(f"Failed to send packet: {packet}")
         LOG.error(e)
         return False
     else:
@@ -100,7 +100,7 @@ class SendPacketThread(aprsd_threads.APRSDThread):

     def __init__(self, packet):
         self.packet = packet
-        super().__init__(f'TX-{packet.to_call}-{self.packet.msgNo}')
+        super().__init__(f"TX-{packet.to_call}-{self.packet.msgNo}")

     def loop(self):
         """Loop until a message is acked or it gets delayed.
@@ -119,9 +119,9 @@ class SendPacketThread(aprsd_threads.APRSDThread):
             # The message has been removed from the tracking queue
             # So it got acked and we are done.
             LOG.info(
-                f'{self.packet.__class__.__name__}'
-                f'({self.packet.msgNo}) '
-                'Message Send Complete via Ack.',
+                f"{self.packet.__class__.__name__}"
+                f"({self.packet.msgNo}) "
+                "Message Send Complete via Ack.",
             )
             return False
         else:
@@ -130,10 +130,10 @@ class SendPacketThread(aprsd_threads.APRSDThread):
                 # we reached the send limit, don't send again
                 # TODO(hemna) - Need to put this in a delayed queue?
                 LOG.info(
-                    f'{packet.__class__.__name__} '
-                    f'({packet.msgNo}) '
-                    'Message Send Complete. Max attempts reached'
-                    f' {packet.retry_count}',
+                    f"{packet.__class__.__name__} "
+                    f"({packet.msgNo}) "
+                    "Message Send Complete. Max attempts reached"
+                    f" {packet.retry_count}",
                 )
                 pkt_tracker.remove(packet.msgNo)
                 return False
@@ -157,9 +157,8 @@ class SendPacketThread(aprsd_threads.APRSDThread):
                 sent = False
                 try:
                     sent = _send_direct(packet)
-                except Exception as ex:
-                    LOG.error(f'Failed to send packet: {packet}')
-                    LOG.error(ex)
+                except Exception:
+                    LOG.error(f"Failed to send packet: {packet}")
                 else:
                     # If an exception happens while sending
                     # we don't want this attempt to count
@@ -179,7 +178,7 @@ class SendAckThread(aprsd_threads.APRSDThread):

     def __init__(self, packet):
         self.packet = packet
-        super().__init__(f'TXAck-{packet.to_call}-{self.packet.msgNo}')
+        super().__init__(f"TXAck-{packet.to_call}-{self.packet.msgNo}")
         self.max_retries = CONF.default_ack_send_count

     def loop(self):
@@ -189,10 +188,10 @@ class SendAckThread(aprsd_threads.APRSDThread):
             # we reached the send limit, don't send again
             # TODO(hemna) - Need to put this in a delayed queue?
             LOG.debug(
-                f'{self.packet.__class__.__name__}'
-                f'({self.packet.msgNo}) '
-                'Send Complete. Max attempts reached'
-                f' {self.max_retries}',
+                f"{self.packet.__class__.__name__}"
+                f"({self.packet.msgNo}) "
+                "Send Complete. Max attempts reached"
+                f" {self.max_retries}",
             )
             return False

@@ -208,7 +207,7 @@ class SendAckThread(aprsd_threads.APRSDThread):
                 # It's time to try to send it again
                 send_now = True
             elif self.loop_count % 10 == 0:
-                LOG.debug(f'Still wating. {delta}')
+                LOG.debug(f"Still wating. {delta}")
         else:
             send_now = True

@@ -217,7 +216,7 @@ class SendAckThread(aprsd_threads.APRSDThread):
             try:
                 sent = _send_direct(self.packet)
             except Exception:
-                LOG.error(f'Failed to send packet: {self.packet}')
+                LOG.error(f"Failed to send packet: {self.packet}")
             else:
                 # If an exception happens while sending
                 # we don't want this attempt to count
@@ -241,18 +240,18 @@ class BeaconSendThread(aprsd_threads.APRSDThread):
     _loop_cnt: int = 1

     def __init__(self):
-        super().__init__('BeaconSendThread')
+        super().__init__("BeaconSendThread")
        self._loop_cnt = 1
        # Make sure Latitude and Longitude are set.
        if not CONF.latitude or not CONF.longitude:
            LOG.error(
-                'Latitude and Longitude are not set in the config file.'
-                'Beacon will not be sent and thread is STOPPED.',
+                "Latitude and Longitude are not set in the config file."
+                "Beacon will not be sent and thread is STOPPED.",
            )
            self.stop()
        LOG.info(
-            'Beacon thread is running and will send '
-            f'beacons every {CONF.beacon_interval} seconds.',
+            "Beacon thread is running and will send "
+            f"beacons every {CONF.beacon_interval} seconds.",
        )

     def loop(self):
@@ -260,10 +259,10 @@ class BeaconSendThread(aprsd_threads.APRSDThread):
         if self._loop_cnt % CONF.beacon_interval == 0:
             pkt = core.BeaconPacket(
                 from_call=CONF.callsign,
-                to_call='APRS',
+                to_call="APRS",
                 latitude=float(CONF.latitude),
                 longitude=float(CONF.longitude),
-                comment='APRSD GPS Beacon',
+                comment="APRSD GPS Beacon",
                 symbol=CONF.beacon_symbol,
             )
             try:
@@ -271,8 +270,8 @@ class BeaconSendThread(aprsd_threads.APRSDThread):
                 pkt.retry_count = 1
                 send(pkt, direct=True)
             except Exception as e:
-                LOG.error(f'Failed to send beacon: {e}')
-                APRSDClient().reset()
+                LOG.error(f"Failed to send beacon: {e}")
+                client_factory.create().reset()
                 time.sleep(5)

         self._loop_cnt += 1
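Most of the tx.py churn above is quote style plus swapping `client_factory.create()` (4.0.2) for `APRSDClient()` (master); the send path itself is unchanged. A hedged sketch of sending a message through this module, where the destination callsign and a configured `CONF.callsign` are assumptions:

```python
# Sketch: assumes aprsd is configured and a client connection is already up.
from oslo_config import cfg

from aprsd import packets
from aprsd.threads import tx

CONF = cfg.CONF

pkt = packets.MessagePacket(
    from_call=CONF.callsign,
    to_call='N0CALL',                 # hypothetical destination
    message_text='hello from aprsd',
)
# send() rate-limits, logs, and (per the hunks above) hands the packet to a
# SendPacketThread that retries until the tracker sees the ack.
tx.send(pkt)
```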
@@ -3,7 +3,7 @@ from typing import Callable, Protocol, runtime_checkable

 from aprsd.utils import singleton

-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 @runtime_checkable
@@ -33,8 +33,7 @@ class KeepAliveCollector:
             try:
                 cls.keepalive_check()
             except Exception as e:
-                LOG.error(f'Error in producer {name} (check): {e}')
-                raise e
+                LOG.error(f"Error in producer {name} (check): {e}")

     def log(self) -> None:
         """Log any relevant information during a KeepAlive check"""
@@ -43,15 +42,14 @@ class KeepAliveCollector:
             try:
                 cls.keepalive_log()
             except Exception as e:
-                LOG.error(f'Error in producer {name} (check): {e}')
-                raise e
+                LOG.error(f"Error in producer {name} (check): {e}")

     def register(self, producer_name: Callable):
         if not isinstance(producer_name, KeepAliveProducer):
-            raise TypeError(f'Producer {producer_name} is not a KeepAliveProducer')
+            raise TypeError(f"Producer {producer_name} is not a KeepAliveProducer")
         self.producers.append(producer_name)

     def unregister(self, producer_name: Callable):
         if not isinstance(producer_name, KeepAliveProducer):
-            raise TypeError(f'Producer {producer_name} is not a KeepAliveProducer')
+            raise TypeError(f"Producer {producer_name} is not a KeepAliveProducer")
         self.producers.remove(producer_name)
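The `KeepAliveCollector` hunks above call `keepalive_check()` and `keepalive_log()` on each registered producer and type-check registrations against a `KeepAliveProducer` protocol. A self-contained sketch of that shape (the real module path is not visible in this diff, so nothing here imports it):

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class KeepAliveProducer(Protocol):
    """Structural contract implied by the collector hunks above."""

    def keepalive_check(self) -> None: ...

    def keepalive_log(self) -> None: ...


class ClientKeepAlive:
    # Hypothetical producer: satisfies the protocol purely by shape.
    def keepalive_check(self) -> None:
        print('checking connection health')

    def keepalive_log(self) -> None:
        print('connection still alive')


assert isinstance(ClientKeepAlive(), KeepAliveProducer)
```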
@@ -6,8 +6,9 @@ import threading

 from oslo_config import cfg


 CONF = cfg.CONF
-LOG = logging.getLogger('APRSD')
+LOG = logging.getLogger("APRSD")


 class ObjectStoreMixin:
@@ -62,7 +63,7 @@ class ObjectStoreMixin:
     def _save_filename(self):
         save_location = CONF.save_location

-        return '{}/{}.p'.format(
+        return "{}/{}.p".format(
             save_location,
             self.__class__.__name__.lower(),
         )
@@ -74,13 +75,13 @@ class ObjectStoreMixin:
         self._init_store()
         save_filename = self._save_filename()
         if len(self) > 0:
-            LOG.debug(
-                f'{self.__class__.__name__}::Saving'
-                f' {len(self)} entries to disk at '
-                f'{save_filename}',
+            LOG.info(
+                f"{self.__class__.__name__}::Saving"
+                f" {len(self)} entries to disk at "
+                f"{save_filename}",
             )
             with self.lock:
-                with open(save_filename, 'wb+') as fp:
+                with open(save_filename, "wb+") as fp:
                     pickle.dump(self.data, fp)
         else:
             LOG.debug(
@@ -96,21 +97,21 @@ class ObjectStoreMixin:
             return
         if os.path.exists(self._save_filename()):
             try:
-                with open(self._save_filename(), 'rb') as fp:
+                with open(self._save_filename(), "rb") as fp:
                     raw = pickle.load(fp)
                     if raw:
                         self.data = raw
                         LOG.debug(
-                            f'{self.__class__.__name__}::Loaded {len(self)} entries from disk.',
+                            f"{self.__class__.__name__}::Loaded {len(self)} entries from disk.",
                         )
                     else:
-                        LOG.debug(f'{self.__class__.__name__}::No data to load.')
+                        LOG.debug(f"{self.__class__.__name__}::No data to load.")
             except (pickle.UnpicklingError, Exception) as ex:
-                LOG.error(f'Failed to UnPickle {self._save_filename()}')
+                LOG.error(f"Failed to UnPickle {self._save_filename()}")
                 LOG.error(ex)
                 self.data = {}
         else:
-            LOG.debug(f'{self.__class__.__name__}::No save file found.')
+            LOG.debug(f"{self.__class__.__name__}::No save file found.")

     def flush(self):
         """Nuke the old pickle file that stored the old results from last aprsd run."""
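`ObjectStoreMixin` above pickles `self.data` to `<save_location>/<classname>.p` under `self.lock`. A minimal subclass sketch; the `load()`/`save()` method names and the surrounding config options are assumed from the hunk context rather than spelled out in it:

```python
from aprsd.utils import objectstore


class SeenCallsigns(objectstore.ObjectStoreMixin):
    """Hypothetical store: the mixin handles pickling for us."""

    def __init__(self):
        super().__init__()
        self.data = {}

    def __len__(self):
        # save() logs len(self), so make it meaningful for this store.
        return len(self.data)

    def add(self, callsign: str):
        with self.lock:
            self.data[callsign] = True


store = SeenCallsigns()
store.load()         # restore <save_location>/seencallsigns.p if present
store.add('N0CALL')
store.save()         # write the pickle back to disk
```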
@@ -1,27 +1,14 @@
+version: "3"
 services:
   aprsd:
     image: hemna6969/aprsd:latest
-    container_name: aprsd-server
-    volumes:
-      - $HOME/.config/aprsd/:/config  # left side of the : is your directory where your config is
-                                      # outside of your container.  Your normal filesystem.
-    restart: unless-stopped
-    environment:
-      - TZ=America/New_York
-      - APRSD_PLUGINS=aprsd-email-plugin,aprsd-borat-plugin
-      - LOG_LEVEL=ERROR
-
-  aprsd-admin:  # Admin interface
-    image: hemna6969/aprsd:latest
-    container_name: aprsd-admin
-    volumes:
-      - $HOME/.config/aprsd/:/config  # left side of the : is your directory where your config is
-                                      # outside of your container.  Your normal filesystem.
-    restart: unless-stopped
-    ports:
-      - 8001:8001  # left side of the : is your port on your host that you can access
-                   # the web interface for the admin interface.
-    entrypoint: /app/admin.sh
-    environment:
-      - TZ=America/New_York
-      - APRSD_EXTENSIONS=git+https://github.com/hemna/aprsd-admin-extension.git
+    container_name: aprsd
+    ports:
+      - "8001:8001"
+    volumes:
+      - $HOME/.config/aprsd:/config
+    restart: unless-stopped
+    environment:
+      - TZ=America/New_York
+      - APRSD_PLUGINS=aprsd-slack-plugin>=1.0.2
+      - LOG_LEVEL=ERROR
@@ -1,10 +1,10 @@
 # This file was autogenerated by uv via the following command:
 #    uv pip compile --resolver backtracking --annotation-style=line requirements-dev.in -o requirements-dev.txt
 alabaster==1.0.0          # via sphinx
-babel==2.17.0             # via sphinx
+babel==2.16.0             # via sphinx
 build==1.2.2.post1        # via pip-tools, -r requirements-dev.in
-cachetools==5.5.2         # via tox
-certifi==2025.1.31        # via requests
+cachetools==5.5.1         # via tox
+certifi==2024.12.14       # via requests
 cfgv==3.4.0               # via pre-commit
 chardet==5.2.0            # via tox
 charset-normalizer==3.4.1 # via requests
@@ -12,37 +12,38 @@ click==8.1.8              # via pip-tools
 colorama==0.4.6           # via tox
 distlib==0.3.9            # via virtualenv
 docutils==0.21.2          # via m2r, sphinx
-filelock==3.18.0          # via tox, virtualenv
-identify==2.6.10          # via pre-commit
+filelock==3.17.0          # via tox, virtualenv
+identify==2.6.6           # via pre-commit
 idna==3.10                # via requests
 imagesize==1.4.1          # via sphinx
-jinja2==3.1.6             # via sphinx
+jinja2==3.1.5             # via sphinx
 m2r==0.3.1                # via -r requirements-dev.in
 markupsafe==3.0.2         # via jinja2
 mistune==0.8.4            # via m2r
 nodeenv==1.9.1            # via pre-commit
-packaging==25.0           # via build, pyproject-api, sphinx, tox
-pip==25.0.1               # via pip-tools, -r requirements-dev.in
+packaging==24.2           # via build, pyproject-api, sphinx, tox
+pip==24.3.1               # via pip-tools, -r requirements-dev.in
 pip-tools==7.4.1          # via -r requirements-dev.in
-platformdirs==4.3.7       # via tox, virtualenv
+platformdirs==4.3.6       # via tox, virtualenv
 pluggy==1.5.0             # via tox
-pre-commit==4.2.0         # via -r requirements-dev.in
+pre-commit==4.1.0         # via -r requirements-dev.in
 pygments==2.19.1          # via sphinx
 pyproject-api==1.9.0      # via tox
 pyproject-hooks==1.2.0    # via build, pip-tools
 pyyaml==6.0.2             # via pre-commit
 requests==2.32.3          # via sphinx
-roman-numerals-py==3.1.0  # via sphinx
-setuptools==79.0.1        # via pip-tools
+setuptools==75.8.0        # via pip-tools
 snowballstemmer==2.2.0    # via sphinx
-sphinx==8.2.3             # via -r requirements-dev.in
+sphinx==8.1.3             # via -r requirements-dev.in
 sphinxcontrib-applehelp==2.0.0        # via sphinx
 sphinxcontrib-devhelp==2.0.0          # via sphinx
 sphinxcontrib-htmlhelp==2.1.0         # via sphinx
 sphinxcontrib-jsmath==1.0.1           # via sphinx
 sphinxcontrib-qthelp==2.0.0           # via sphinx
 sphinxcontrib-serializinghtml==2.0.0  # via sphinx
-tox==4.25.0               # via -r requirements-dev.in
-urllib3==2.4.0            # via requests
-virtualenv==20.30.0       # via pre-commit, tox
+tomli==2.2.1              # via build, pip-tools, pyproject-api, sphinx, tox
+tox==4.24.1               # via -r requirements-dev.in
+typing-extensions==4.12.2 # via tox
+urllib3==2.3.0            # via requests
+virtualenv==20.29.1       # via pre-commit, tox
 wheel==0.45.1             # via pip-tools, -r requirements-dev.in
@@ -7,7 +7,8 @@ loguru
 oslo.config
 pluggy
 requests
-rich
+# Pinned due to gray needing 12.6.0
+rich~=12.6.0
 rush
 thesmuggler
 tzlocal
@@ -1,12 +1,13 @@
 # This file was autogenerated by uv via the following command:
 #    uv pip compile --resolver backtracking --annotation-style=line requirements.in -o requirements.txt
 aprslib==0.7.2            # via -r requirements.in
-attrs==25.3.0             # via ax253, kiss3, rush
+attrs==24.3.0             # via ax253, kiss3, rush
 ax253==0.1.5.post1        # via kiss3
-bitarray==3.3.1           # via ax253, kiss3
-certifi==2025.1.31        # via requests
+bitarray==3.0.0           # via ax253, kiss3
+certifi==2024.12.14       # via requests
 charset-normalizer==3.4.1 # via requests
 click==8.1.8              # via -r requirements.in
+commonmark==0.9.1         # via rich
 dataclasses-json==0.6.7   # via -r requirements.in
 debtcollector==3.0.0      # via oslo-config
 haversine==2.9.0          # via -r requirements.in
@@ -14,33 +15,30 @@ idna==3.10                # via requests
 importlib-metadata==8.6.1 # via ax253, kiss3
 kiss3==8.0.0              # via -r requirements.in
 loguru==0.7.3             # via -r requirements.in
-markdown-it-py==3.0.0     # via rich
-marshmallow==3.26.1       # via dataclasses-json
-mdurl==0.1.2              # via markdown-it-py
-mypy-extensions==1.1.0    # via typing-inspect
+marshmallow==3.26.0       # via dataclasses-json
+mypy-extensions==1.0.0    # via typing-inspect
 netaddr==1.3.0            # via oslo-config
-oslo-config==9.7.1        # via -r requirements.in
-oslo-i18n==6.5.1          # via oslo-config
-packaging==25.0           # via marshmallow
-pbr==6.1.1                # via oslo-i18n, stevedore
+oslo-config==9.7.0        # via -r requirements.in
+oslo-i18n==6.5.0          # via oslo-config
+packaging==24.2           # via marshmallow
+pbr==6.1.0                # via oslo-i18n, stevedore
 pluggy==1.5.0             # via -r requirements.in
 pygments==2.19.1          # via rich
 pyserial==3.5             # via pyserial-asyncio
 pyserial-asyncio==0.6     # via kiss3
-pytz==2025.2              # via -r requirements.in
+pytz==2024.2              # via -r requirements.in
 pyyaml==6.0.2             # via oslo-config
 requests==2.32.3          # via oslo-config, update-checker, -r requirements.in
 rfc3986==2.0.0            # via oslo-config
-rich==14.0.0              # via -r requirements.in
+rich==12.6.0              # via -r requirements.in
 rush==2021.4.0            # via -r requirements.in
-setuptools==79.0.1        # via pbr
-stevedore==5.4.1          # via oslo-config
+stevedore==5.4.0          # via oslo-config
 thesmuggler==1.0.1        # via -r requirements.in
 timeago==1.0.16           # via -r requirements.in
-typing-extensions==4.13.2 # via typing-inspect
+typing-extensions==4.12.2 # via typing-inspect
 typing-inspect==0.9.0     # via dataclasses-json
-tzlocal==5.3.1            # via -r requirements.in
+tzlocal==5.2              # via -r requirements.in
 update-checker==0.18.0    # via -r requirements.in
-urllib3==2.4.0            # via requests
+urllib3==2.3.0            # via requests
 wrapt==1.17.2             # via debtcollector, -r requirements.in
 zipp==3.21.0              # via importlib-metadata
|
|||||||
import datetime
|
|
||||||
import unittest
|
|
||||||
from unittest import mock
|
|
||||||
|
|
||||||
from aprslib.exceptions import LoginError
|
|
||||||
|
|
||||||
from aprsd import exception
|
|
||||||
from aprsd.client.drivers.aprsis import APRSISDriver
|
|
||||||
from aprsd.client.drivers.registry import ClientDriver
|
|
||||||
from aprsd.packets import core
|
|
||||||
|
|
||||||
|
|
||||||
class TestAPRSISDriver(unittest.TestCase):
|
|
||||||
"""Unit tests for the APRSISDriver class."""
|
|
||||||
|
|
||||||
def setUp(self):
|
|
||||||
# Mock configuration
|
|
||||||
self.conf_patcher = mock.patch('aprsd.client.drivers.aprsis.CONF')
|
|
||||||
self.mock_conf = self.conf_patcher.start()
|
|
||||||
|
|
||||||
# Configure APRS-IS settings
|
|
||||||
self.mock_conf.aprs_network.enabled = True
|
|
||||||
self.mock_conf.aprs_network.login = 'TEST'
|
|
||||||
self.mock_conf.aprs_network.password = '12345'
|
|
||||||
self.mock_conf.aprs_network.host = 'rotate.aprs.net'
|
|
||||||
self.mock_conf.aprs_network.port = 14580
|
|
||||||
|
|
||||||
# Mock APRS Lib Client
|
|
||||||
self.aprslib_patcher = mock.patch('aprsd.client.drivers.aprsis.APRSLibClient')
|
|
||||||
self.mock_aprslib = self.aprslib_patcher.start()
|
|
||||||
self.mock_client = mock.MagicMock()
|
|
||||||
self.mock_aprslib.return_value = self.mock_client
|
|
||||||
|
|
||||||
# Create an instance of the driver
|
|
||||||
self.driver = APRSISDriver()
|
|
||||||
|
|
||||||
def tearDown(self):
|
|
||||||
self.conf_patcher.stop()
|
|
||||||
self.aprslib_patcher.stop()
|
|
||||||
|
|
||||||
def test_implements_client_driver_protocol(self):
|
|
||||||
"""Test that APRSISDriver implements the ClientDriver Protocol."""
|
|
||||||
# Verify the instance is recognized as implementing the Protocol
|
|
||||||
self.assertIsInstance(self.driver, ClientDriver)
|
|
||||||
|
|
||||||
# Verify all required methods are present with correct signatures
|
|
||||||
required_methods = [
|
|
||||||
'is_enabled',
|
|
||||||
'is_configured',
|
|
||||||
'is_alive',
|
|
||||||
'close',
|
|
||||||
'send',
|
|
||||||
'setup_connection',
|
|
||||||
'set_filter',
|
|
||||||
'login_success',
|
|
||||||
'login_failure',
|
|
||||||
'consumer',
|
|
||||||
'decode_packet',
|
|
||||||
'stats',
|
|
||||||
]
|
|
||||||
|
|
||||||
for method_name in required_methods:
|
|
||||||
self.assertTrue(
|
|
||||||
hasattr(self.driver, method_name),
|
|
||||||
f'Missing required method: {method_name}',
|
|
||||||
)
|
|
||||||
|
|
||||||
def test_init(self):
|
|
||||||
"""Test initialization sets default values."""
|
|
||||||
self.assertIsInstance(self.driver.max_delta, datetime.timedelta)
|
|
||||||
self.assertEqual(self.driver.max_delta, datetime.timedelta(minutes=2))
|
|
||||||
self.assertFalse(self.driver.login_status['success'])
|
|
||||||
self.assertIsNone(self.driver.login_status['message'])
|
|
||||||
self.assertIsNone(self.driver._client)
|
|
||||||
|
|
||||||
def test_is_enabled_true(self):
|
|
||||||
"""Test is_enabled returns True when APRS-IS is enabled."""
|
|
||||||
self.mock_conf.aprs_network.enabled = True
|
|
||||||
self.assertTrue(APRSISDriver.is_enabled())
|
|
||||||
|
|
||||||
def test_is_enabled_false(self):
|
|
||||||
"""Test is_enabled returns False when APRS-IS is disabled."""
|
|
||||||
self.mock_conf.aprs_network.enabled = False
|
|
||||||
self.assertFalse(APRSISDriver.is_enabled())
|
|
||||||
|
|
||||||
def test_is_enabled_key_error(self):
|
|
||||||
"""Test is_enabled returns False when enabled flag doesn't exist."""
|
|
||||||
self.mock_conf.aprs_network = mock.MagicMock()
|
|
||||||
type(self.mock_conf.aprs_network).enabled = mock.PropertyMock(
|
|
||||||
side_effect=KeyError
|
|
||||||
)
|
|
||||||
self.assertFalse(APRSISDriver.is_enabled())
|
|
||||||
|
|
||||||
def test_is_configured_true(self):
|
|
||||||
"""Test is_configured returns True when properly configured."""
|
|
||||||
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
|
|
||||||
self.mock_conf.aprs_network.login = 'TEST'
|
|
||||||
self.mock_conf.aprs_network.password = '12345'
|
|
||||||
self.mock_conf.aprs_network.host = 'rotate.aprs.net'
|
|
||||||
|
|
||||||
self.assertTrue(APRSISDriver.is_configured())
|
|
||||||
|
|
||||||
def test_is_configured_no_login(self):
|
|
||||||
"""Test is_configured raises exception when login not set."""
|
|
||||||
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
|
|
||||||
self.mock_conf.aprs_network.login = None
|
|
||||||
|
|
||||||
with self.assertRaises(exception.MissingConfigOptionException):
|
|
||||||
APRSISDriver.is_configured()
|
|
||||||
|
|
||||||
def test_is_configured_no_password(self):
|
|
||||||
"""Test is_configured raises exception when password not set."""
|
|
||||||
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
|
|
||||||
self.mock_conf.aprs_network.login = 'TEST'
|
|
||||||
self.mock_conf.aprs_network.password = None
|
|
||||||
|
|
||||||
with self.assertRaises(exception.MissingConfigOptionException):
|
|
||||||
APRSISDriver.is_configured()
|
|
||||||
|
|
||||||
def test_is_configured_no_host(self):
|
|
||||||
"""Test is_configured raises exception when host not set."""
|
|
||||||
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
|
|
||||||
self.mock_conf.aprs_network.login = 'TEST'
|
|
||||||
self.mock_conf.aprs_network.password = '12345'
|
|
||||||
self.mock_conf.aprs_network.host = None
|
|
||||||
|
|
||||||
with self.assertRaises(exception.MissingConfigOptionException):
|
|
||||||
APRSISDriver.is_configured()
|
|
||||||
|
|
||||||
def test_is_configured_disabled(self):
|
|
||||||
"""Test is_configured returns True when not enabled."""
|
|
||||||
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=False):
|
|
||||||
self.assertTrue(APRSISDriver.is_configured())
|
|
||||||
|
|
||||||
def test_is_alive_no_client(self):
|
|
||||||
"""Test is_alive returns False when no client."""
|
|
||||||
self.driver._client = None
|
|
||||||
self.assertFalse(self.driver.is_alive)
|
|
||||||
|
|
||||||
def test_is_alive_true(self):
|
|
||||||
"""Test is_alive returns True when client is alive and connection is not stale."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
self.mock_client.is_alive.return_value = True
|
|
||||||
|
|
||||||
with mock.patch.object(self.driver, '_is_stale_connection', return_value=False):
|
|
||||||
self.assertTrue(self.driver.is_alive)
|
|
||||||
|
|
||||||
def test_is_alive_client_not_alive(self):
|
|
||||||
"""Test is_alive returns False when client is not alive."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
self.mock_client.is_alive.return_value = False
|
|
||||||
|
|
||||||
with mock.patch.object(self.driver, '_is_stale_connection', return_value=False):
|
|
||||||
self.assertFalse(self.driver.is_alive)
|
|
||||||
|
|
||||||
def test_is_alive_stale_connection(self):
|
|
||||||
"""Test is_alive returns False when connection is stale."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
self.mock_client.is_alive.return_value = True
|
|
||||||
|
|
||||||
with mock.patch.object(self.driver, '_is_stale_connection', return_value=True):
|
|
||||||
self.assertFalse(self.driver.is_alive)
|
|
||||||
|
|
||||||
def test_close(self):
|
|
||||||
"""Test close method stops and closes the client."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
|
|
||||||
self.driver.close()
|
|
||||||
|
|
||||||
self.mock_client.stop.assert_called_once()
|
|
||||||
self.mock_client.close.assert_called_once()
|
|
||||||
|
|
||||||
def test_close_no_client(self):
|
|
||||||
"""Test close method handles no client gracefully."""
|
|
||||||
self.driver._client = None
|
|
||||||
|
|
||||||
# Should not raise exception
|
|
||||||
self.driver.close()
|
|
||||||
|
|
||||||
def test_send(self):
|
|
||||||
"""Test send passes packet to client."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
mock_packet = mock.MagicMock(spec=core.Packet)
|
|
||||||
|
|
||||||
self.driver.send(mock_packet)
|
|
||||||
|
|
||||||
self.mock_client.send.assert_called_once_with(mock_packet)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
def test_setup_connection_success(self, mock_log):
|
|
||||||
"""Test setup_connection successfully connects."""
|
|
||||||
# Configure successful connection
|
|
||||||
self.mock_client.server_string = 'Test APRS-IS Server'
|
|
||||||
|
|
||||||
self.driver.setup_connection()
|
|
||||||
|
|
||||||
# Check client created with correct parameters
|
|
||||||
self.mock_aprslib.assert_called_once_with(
|
|
||||||
self.mock_conf.aprs_network.login,
|
|
||||||
passwd=self.mock_conf.aprs_network.password,
|
|
||||||
host=self.mock_conf.aprs_network.host,
|
|
||||||
port=self.mock_conf.aprs_network.port,
|
|
||||||
)
|
|
||||||
|
|
||||||
# Check logger set and connection initialized
|
|
||||||
self.assertEqual(self.mock_client.logger, mock_log)
|
|
||||||
self.mock_client.connect.assert_called_once()
|
|
||||||
|
|
||||||
# Check status updated
|
|
||||||
self.assertTrue(self.driver.connected)
|
|
||||||
self.assertTrue(self.driver.login_status['success'])
|
|
||||||
self.assertEqual(self.driver.login_status['message'], 'Test APRS-IS Server')
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.time.sleep')
|
|
||||||
def test_setup_connection_login_error(self, mock_sleep, mock_log):
|
|
||||||
"""Test setup_connection handles login error."""
|
|
||||||
# Configure login error
|
|
||||||
login_error = LoginError('Bad login')
|
|
||||||
login_error.message = 'Invalid login credentials'
|
|
||||||
self.mock_client.connect.side_effect = login_error
|
|
||||||
|
|
||||||
self.driver.setup_connection()
|
|
||||||
|
|
||||||
# Check error logged
|
|
||||||
mock_log.error.assert_any_call("Failed to login to APRS-IS Server 'Bad login'")
|
|
||||||
mock_log.error.assert_any_call('Invalid login credentials')
|
|
||||||
|
|
||||||
# Check status updated
|
|
||||||
self.assertFalse(self.driver.connected)
|
|
||||||
self.assertFalse(self.driver.login_status['success'])
|
|
||||||
self.assertEqual(
|
|
||||||
self.driver.login_status['message'], 'Invalid login credentials'
|
|
||||||
)
|
|
||||||
|
|
||||||
# Check backoff used
|
|
||||||
mock_sleep.assert_called()
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.time.sleep')
|
|
||||||
def test_setup_connection_general_error(self, mock_sleep, mock_log):
|
|
||||||
"""Test setup_connection handles general error."""
|
|
||||||
# Configure general exception
|
|
||||||
error_message = 'Connection error'
|
|
||||||
error = Exception(error_message)
|
|
||||||
# Standard exceptions don't have a message attribute
|
|
||||||
self.mock_client.connect.side_effect = error
|
|
||||||
|
|
||||||
self.driver.setup_connection()
|
|
||||||
|
|
||||||
# Check error logged
|
|
||||||
mock_log.error.assert_any_call(
|
|
||||||
f"Unable to connect to APRS-IS server. '{error_message}' "
|
|
||||||
)
|
|
||||||
|
|
||||||
# Check status updated
|
|
||||||
self.assertFalse(self.driver.connected)
|
|
||||||
self.assertFalse(self.driver.login_status['success'])
|
|
||||||
|
|
||||||
# Check login message contains the error message (more flexible than exact equality)
|
|
||||||
self.assertIn(error_message, self.driver.login_status['message'])
|
|
||||||
|
|
||||||
# Check backoff used
|
|
||||||
mock_sleep.assert_called()
|
|
||||||
|
|
||||||
def test_set_filter(self):
|
|
||||||
"""Test set_filter passes filter to client."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
test_filter = 'm/50'
|
|
||||||
|
|
||||||
self.driver.set_filter(test_filter)
|
|
||||||
|
|
||||||
self.mock_client.set_filter.assert_called_once_with(test_filter)
|
|
||||||
|
|
||||||
def test_login_success(self):
|
|
||||||
"""Test login_success returns login status."""
|
|
||||||
self.driver.login_status['success'] = True
|
|
||||||
self.assertTrue(self.driver.login_success())
|
|
||||||
|
|
||||||
self.driver.login_status['success'] = False
|
|
||||||
self.assertFalse(self.driver.login_success())
|
|
||||||
|
|
||||||
def test_login_failure(self):
|
|
||||||
"""Test login_failure returns error message."""
|
|
||||||
self.driver.login_status['message'] = None
|
|
||||||
self.assertIsNone(self.driver.login_failure())
|
|
||||||
|
|
||||||
self.driver.login_status['message'] = 'Test error'
|
|
||||||
self.assertEqual(self.driver.login_failure(), 'Test error')
|
|
||||||
|
|
||||||
def test_filter_property(self):
|
|
||||||
"""Test filter property returns client filter."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
test_filter = 'm/50'
|
|
||||||
self.mock_client.filter = test_filter
|
|
||||||
|
|
||||||
self.assertEqual(self.driver.filter, test_filter)
|
|
||||||
|
|
||||||
def test_server_string_property(self):
|
|
||||||
"""Test server_string property returns client server string."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
test_string = 'Test APRS-IS Server'
|
|
||||||
self.mock_client.server_string = test_string
|
|
||||||
|
|
||||||
self.assertEqual(self.driver.server_string, test_string)
|
|
||||||
|
|
||||||
def test_keepalive_property(self):
|
|
||||||
"""Test keepalive property returns client keepalive."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
test_time = datetime.datetime.now()
|
|
||||||
self.mock_client.aprsd_keepalive = test_time
|
|
||||||
|
|
||||||
self.assertEqual(self.driver.keepalive, test_time)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
def test_is_stale_connection_true(self, mock_log):
|
|
||||||
"""Test _is_stale_connection returns True when connection is stale."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
# Set keepalive to 3 minutes ago (exceeds max_delta of 2 minutes)
|
|
||||||
self.mock_client.aprsd_keepalive = datetime.datetime.now() - datetime.timedelta(
|
|
||||||
minutes=3
|
|
||||||
)
|
|
||||||
|
|
||||||
result = self.driver._is_stale_connection()
|
|
||||||
|
|
||||||
self.assertTrue(result)
|
|
||||||
mock_log.error.assert_called_once()
|
|
||||||
|
|
||||||
def test_is_stale_connection_false(self):
|
|
||||||
"""Test _is_stale_connection returns False when connection is not stale."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
# Set keepalive to 1 minute ago (within max_delta of 2 minutes)
|
|
||||||
self.mock_client.aprsd_keepalive = datetime.datetime.now() - datetime.timedelta(
|
|
||||||
minutes=1
|
|
||||||
)
|
|
||||||
|
|
||||||
result = self.driver._is_stale_connection()
|
|
||||||
|
|
||||||
self.assertFalse(result)
|
|
||||||
|
|
||||||
def test_transport(self):
|
|
||||||
"""Test transport returns appropriate transport type."""
|
|
||||||
self.assertEqual(APRSISDriver.transport(), 'aprsis')
|
|
||||||
|
|
||||||
def test_decode_packet(self):
|
|
||||||
"""Test decode_packet uses core.factory."""
|
|
||||||
with mock.patch('aprsd.client.drivers.aprsis.core.factory') as mock_factory:
|
|
||||||
raw_packet = {'from': 'TEST', 'to': 'APRS'}
|
|
||||||
self.driver.decode_packet(raw_packet)
|
|
||||||
mock_factory.assert_called_once_with(raw_packet)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
def test_consumer_success(self, mock_log):
|
|
||||||
"""Test consumer forwards callback to client."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
mock_callback = mock.MagicMock()
|
|
||||||
|
|
||||||
self.driver.consumer(mock_callback, raw=True)
|
|
||||||
|
|
||||||
self.mock_client.consumer.assert_called_once_with(
|
|
||||||
mock_callback, blocking=False, immortal=False, raw=True
|
|
||||||
)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
def test_consumer_exception(self, mock_log):
|
|
||||||
"""Test consumer handles exceptions."""
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
mock_callback = mock.MagicMock()
|
|
||||||
test_error = Exception('Test error')
|
|
||||||
self.mock_client.consumer.side_effect = test_error
|
|
||||||
|
|
||||||
with self.assertRaises(Exception): # noqa: B017
|
|
||||||
self.driver.consumer(mock_callback)
|
|
||||||
|
|
||||||
mock_log.error.assert_called_with(test_error)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.aprsis.LOG')
|
|
||||||
def test_consumer_no_client(self, mock_log):
|
|
||||||
"""Test consumer handles no client gracefully."""
|
|
||||||
self.driver._client = None
|
|
||||||
mock_callback = mock.MagicMock()
|
|
||||||
|
|
||||||
self.driver.consumer(mock_callback)
|
|
||||||
|
|
||||||
mock_log.warning.assert_called_once()
|
|
||||||
self.assertFalse(self.driver.connected)
|
|
||||||
|
|
||||||
def test_stats_configured_with_client(self):
|
|
||||||
"""Test stats returns correct data when configured with client."""
|
|
||||||
# Configure driver
|
|
||||||
with mock.patch.object(self.driver, 'is_configured', return_value=True):
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
self.mock_client.aprsd_keepalive = datetime.datetime.now()
|
|
||||||
self.mock_client.server_string = 'Test Server'
|
|
||||||
self.mock_client.filter = 'm/50'
|
|
||||||
|
|
||||||
stats = self.driver.stats()
|
|
||||||
|
|
||||||
self.assertEqual(stats['connected'], True)
|
|
||||||
self.assertEqual(stats['filter'], 'm/50')
|
|
||||||
self.assertEqual(stats['server_string'], 'Test Server')
|
|
||||||
self.assertEqual(stats['transport'], 'aprsis')
|
|
||||||
|
|
||||||
def test_stats_serializable(self):
|
|
||||||
"""Test stats with serializable=True converts datetime to ISO format."""
|
|
||||||
# Configure driver
|
|
||||||
with mock.patch.object(self.driver, 'is_configured', return_value=True):
|
|
||||||
self.driver._client = self.mock_client
|
|
||||||
test_time = datetime.datetime.now()
|
|
||||||
self.mock_client.aprsd_keepalive = test_time
|
|
||||||
|
|
||||||
stats = self.driver.stats(serializable=True)
|
|
||||||
|
|
||||||
# Check keepalive is a string in ISO format
|
|
||||||
self.assertIsInstance(stats['connection_keepalive'], str)
|
|
||||||
# Try parsing it to verify it's a valid ISO format
|
|
||||||
try:
|
|
||||||
datetime.datetime.fromisoformat(stats['connection_keepalive'])
|
|
||||||
except ValueError:
|
|
||||||
self.fail('keepalive is not in valid ISO format')
|
|
||||||
|
|
||||||
def test_stats_no_client(self):
|
|
||||||
"""Test stats with no client."""
|
|
||||||
with mock.patch.object(self.driver, 'is_configured', return_value=True):
|
|
||||||
self.driver._client = None
|
|
||||||
|
|
||||||
stats = self.driver.stats()
|
|
||||||
|
|
||||||
self.assertEqual(stats['connection_keepalive'], 'None')
|
|
||||||
self.assertEqual(stats['server_string'], 'None')
|
|
||||||
|
|
||||||
def test_stats_not_configured(self):
|
|
||||||
"""Test stats when not configured returns empty dict."""
|
|
||||||
with mock.patch.object(self.driver, 'is_configured', return_value=False):
|
|
||||||
stats = self.driver.stats()
|
|
||||||
self.assertEqual(stats, {})
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
|
||||||
unittest.main()
|
|
@@ -1,191 +0,0 @@
-import unittest
-from unittest import mock
-
-from aprsd.client.drivers.fake import APRSDFakeDriver
-from aprsd.packets import core
-
-
-class TestAPRSDFakeDriver(unittest.TestCase):
-    """Unit tests for the APRSDFakeDriver class."""
-
-    def setUp(self):
-        # Mock CONF for testing
-        self.conf_patcher = mock.patch('aprsd.client.drivers.fake.CONF')
-        self.mock_conf = self.conf_patcher.start()
-
-        # Configure fake_client.enabled
-        self.mock_conf.fake_client.enabled = True
-
-        # Create an instance of the driver
-        self.driver = APRSDFakeDriver()
-
-    def tearDown(self):
-        self.conf_patcher.stop()
-
-    def test_init(self):
-        """Test initialization sets default values."""
-        self.assertEqual(self.driver.path, ['WIDE1-1', 'WIDE2-1'])
-        self.assertFalse(self.driver.thread_stop)
-
-    def test_is_enabled_true(self):
-        """Test is_enabled returns True when configured."""
-        self.mock_conf.fake_client.enabled = True
-        self.assertTrue(APRSDFakeDriver.is_enabled())
-
-    def test_is_enabled_false(self):
-        """Test is_enabled returns False when not configured."""
-        self.mock_conf.fake_client.enabled = False
-        self.assertFalse(APRSDFakeDriver.is_enabled())
-
-    def test_is_alive(self):
-        """Test is_alive returns True when thread_stop is False."""
-        self.driver.thread_stop = False
-        self.assertTrue(self.driver.is_alive())
-
-        self.driver.thread_stop = True
-        self.assertFalse(self.driver.is_alive())
-
-    def test_close(self):
-        """Test close sets thread_stop to True."""
-        self.driver.thread_stop = False
-        self.driver.close()
-        self.assertTrue(self.driver.thread_stop)
-
-    @mock.patch('aprsd.client.drivers.fake.LOG')
-    def test_setup_connection(self, mock_log):
-        """Test setup_connection does nothing (it's fake)."""
-        self.driver.setup_connection()
-        # Method doesn't do anything, so just verify it doesn't crash
-
-    def test_set_filter(self):
-        """Test set_filter method does nothing (it's fake)."""
-        # Just test it doesn't fail
-        self.driver.set_filter('test/filter')
-
-    def test_login_success(self):
-        """Test login_success always returns True."""
-        self.assertTrue(self.driver.login_success())
-
-    def test_login_failure(self):
-        """Test login_failure always returns None."""
-        self.assertIsNone(self.driver.login_failure())
-
-    @mock.patch('aprsd.client.drivers.fake.LOG')
-    def test_send_with_packet_object(self, mock_log):
-        """Test send with a Packet object."""
-        mock_packet = mock.MagicMock(spec=core.Packet)
-        mock_packet.payload = 'Test payload'
-        mock_packet.to_call = 'TEST'
-        mock_packet.from_call = 'FAKE'
-
-        self.driver.send(mock_packet)
-
-        mock_log.info.assert_called_once()
-        mock_packet.prepare.assert_called_once()
-
-    @mock.patch('aprsd.client.drivers.fake.LOG')
-    def test_send_with_non_packet_object(self, mock_log):
-        """Test send with a non-Packet object."""
-        # Create a mock message-like object
-        mock_msg = mock.MagicMock()
-        mock_msg.raw = 'Test'
-        mock_msg.msgNo = '123'
-        mock_msg.to_call = 'TEST'
-        mock_msg.from_call = 'FAKE'
-
-        self.driver.send(mock_msg)
-
-        mock_log.info.assert_called_once()
-        mock_log.debug.assert_called_once()
-
-    @mock.patch('aprsd.client.drivers.fake.LOG')
-    @mock.patch('aprsd.client.drivers.fake.time.sleep')
-    def test_consumer_with_raw_true(self, mock_sleep, mock_log):
-        """Test consumer with raw=True."""
-        mock_callback = mock.MagicMock()
-
-        self.driver.consumer(mock_callback, raw=True)
-
-        # Verify callback was called with raw data
-        mock_callback.assert_called_once()
-        call_args = mock_callback.call_args[1]
-        self.assertIn('raw', call_args)
-        mock_sleep.assert_called_once_with(1)
-
-    @mock.patch('aprsd.client.drivers.fake.LOG')
-    @mock.patch('aprsd.client.drivers.fake.aprslib.parse')
-    @mock.patch('aprsd.client.drivers.fake.core.factory')
-    @mock.patch('aprsd.client.drivers.fake.time.sleep')
-    def test_consumer_with_raw_false(
-        self, mock_sleep, mock_factory, mock_parse, mock_log
-    ):
-        """Test consumer with raw=False."""
-        mock_callback = mock.MagicMock()
-        mock_packet = mock.MagicMock(spec=core.Packet)
-        mock_factory.return_value = mock_packet
-
-        self.driver.consumer(mock_callback, raw=False)
-
-        # Verify the packet was created and passed to callback
-        mock_parse.assert_called_once()
-        mock_factory.assert_called_once()
-        mock_callback.assert_called_once_with(packet=mock_packet)
-        mock_sleep.assert_called_once_with(1)
-
-    def test_consumer_updates_keepalive(self):
-        """Test consumer updates keepalive timestamp."""
-        mock_callback = mock.MagicMock()
-        old_keepalive = self.driver.aprsd_keepalive
-
-        # Force a small delay to ensure timestamp changes
-        import time
-
-        time.sleep(0.01)
-
-        with mock.patch('aprsd.client.drivers.fake.time.sleep'):
-            self.driver.consumer(mock_callback)
-
-        self.assertNotEqual(old_keepalive, self.driver.aprsd_keepalive)
-        self.assertGreater(self.driver.aprsd_keepalive, old_keepalive)
-
-    def test_decode_packet_with_empty_kwargs(self):
-        """Test decode_packet with empty kwargs."""
-        result = self.driver.decode_packet()
-        self.assertIsNone(result)
-
-    def test_decode_packet_with_packet(self):
-        """Test decode_packet with packet in kwargs."""
-        mock_packet = mock.MagicMock(spec=core.Packet)
-        result = self.driver.decode_packet(packet=mock_packet)
-        self.assertEqual(result, mock_packet)
-
-    @mock.patch('aprsd.client.drivers.fake.aprslib.parse')
-    @mock.patch('aprsd.client.drivers.fake.core.factory')
-    def test_decode_packet_with_raw(self, mock_factory, mock_parse):
-        """Test decode_packet with raw in kwargs."""
-        mock_packet = mock.MagicMock(spec=core.Packet)
-        mock_factory.return_value = mock_packet
-        raw_data = 'raw packet data'
-
-        result = self.driver.decode_packet(raw=raw_data)
-
-        mock_parse.assert_called_once_with(raw_data)
-        mock_factory.assert_called_once_with(mock_parse.return_value)
-        self.assertEqual(result, mock_packet)
-
-    def test_stats(self):
-        """Test stats returns correct information."""
-        self.driver.thread_stop = False
-        result = self.driver.stats()
-
-        self.assertEqual(result['driver'], 'APRSDFakeDriver')
-        self.assertTrue(result['is_alive'])
-
-        # Test with serializable parameter
-        result_serializable = self.driver.stats(serializable=True)
-        self.assertEqual(result_serializable['driver'], 'APRSDFakeDriver')
-        self.assertTrue(result_serializable['is_alive'])
-
-
-if __name__ == '__main__':
-    unittest.main()
@@ -1,498 +0,0 @@
-import datetime
-import socket
-import unittest
-from unittest import mock
-
-import aprslib
-
-from aprsd import exception
-from aprsd.client.drivers.registry import ClientDriver
-from aprsd.client.drivers.tcpkiss import TCPKISSDriver
-from aprsd.packets import core
-
-
-class TestTCPKISSDriver(unittest.TestCase):
-    """Unit tests for the TCPKISSDriver class."""
-
-    def setUp(self):
-        # Mock configuration
-        self.conf_patcher = mock.patch('aprsd.client.drivers.tcpkiss.CONF')
-        self.mock_conf = self.conf_patcher.start()
-
-        # Configure KISS settings
-        self.mock_conf.kiss_tcp.enabled = True
-        self.mock_conf.kiss_tcp.host = '127.0.0.1'
-        self.mock_conf.kiss_tcp.port = 8001
-        self.mock_conf.kiss_tcp.path = ['WIDE1-1', 'WIDE2-1']
-
-        # Mock socket
-        self.socket_patcher = mock.patch('aprsd.client.drivers.tcpkiss.socket')
-        self.mock_socket_module = self.socket_patcher.start()
-        self.mock_socket = mock.MagicMock()
-        self.mock_socket_module.socket.return_value = self.mock_socket
-
-        # Mock select
-        self.select_patcher = mock.patch('aprsd.client.drivers.tcpkiss.select')
-        self.mock_select = self.select_patcher.start()
-
-        # Create an instance of the driver
-        self.driver = TCPKISSDriver()
-
-    def tearDown(self):
-        self.conf_patcher.stop()
-        self.socket_patcher.stop()
-        self.select_patcher.stop()
-
-    def test_implements_client_driver_protocol(self):
-        """Test that TCPKISSDriver implements the ClientDriver Protocol."""
-        # Verify the instance is recognized as implementing the Protocol
-        self.assertIsInstance(self.driver, ClientDriver)
-
-        # Verify all required methods are present with correct signatures
-        required_methods = [
-            'is_enabled',
-            'is_configured',
-            'is_alive',
-            'close',
-            'send',
-            'setup_connection',
-            'set_filter',
-            'login_success',
-            'login_failure',
-            'consumer',
-            'decode_packet',
-            'stats',
-        ]
-
-        for method_name in required_methods:
-            self.assertTrue(
-                hasattr(self.driver, method_name),
-                f'Missing required method: {method_name}',
-            )
-
-    def test_init(self):
-        """Test initialization sets default values."""
-        self.assertFalse(self.driver._connected)
-        self.assertIsInstance(self.driver.keepalive, datetime.datetime)
-        self.assertFalse(self.driver._running)
-
-    def test_transport_property(self):
-        """Test transport property returns correct value."""
-        self.assertEqual(self.driver.transport, 'tcpkiss')
-
-    def test_is_enabled_true(self):
-        """Test is_enabled returns True when KISS TCP is enabled."""
-        self.mock_conf.kiss_tcp.enabled = True
-        self.assertTrue(TCPKISSDriver.is_enabled())
-
-    def test_is_enabled_false(self):
-        """Test is_enabled returns False when KISS TCP is disabled."""
-        self.mock_conf.kiss_tcp.enabled = False
-        self.assertFalse(TCPKISSDriver.is_enabled())
-
-    def test_is_configured_true(self):
-        """Test is_configured returns True when properly configured."""
-        with mock.patch.object(TCPKISSDriver, 'is_enabled', return_value=True):
-            self.mock_conf.kiss_tcp.host = '127.0.0.1'
-            self.assertTrue(TCPKISSDriver.is_configured())
-
-    def test_is_configured_false_no_host(self):
-        """Test is_configured returns False when host not set."""
-        with mock.patch.object(TCPKISSDriver, 'is_enabled', return_value=True):
-            self.mock_conf.kiss_tcp.host = None
-            with self.assertRaises(exception.MissingConfigOptionException):
-                TCPKISSDriver.is_configured()
-
-    def test_is_configured_false_not_enabled(self):
-        """Test is_configured returns False when not enabled."""
-        with mock.patch.object(TCPKISSDriver, 'is_enabled', return_value=False):
-            self.assertFalse(TCPKISSDriver.is_configured())
-
-    def test_is_alive(self):
-        """Test is_alive property returns connection state."""
-        self.driver._connected = True
-        self.assertTrue(self.driver.is_alive)
-
-        self.driver._connected = False
-        self.assertFalse(self.driver.is_alive)
-
-    def test_close(self):
-        """Test close method calls stop."""
-        with mock.patch.object(self.driver, 'stop') as mock_stop:
-            self.driver.close()
-            mock_stop.assert_called_once()
-
-    @mock.patch('aprsd.client.drivers.tcpkiss.LOG')
-    def test_setup_connection_success(self, mock_log):
-        """Test setup_connection successfully connects."""
-        # Mock the connect method to succeed
-        is_en = self.driver.is_enabled
-        is_con = self.driver.is_configured
-        self.driver.is_enabled = mock.MagicMock(return_value=True)
-        self.driver.is_configured = mock.MagicMock(return_value=True)
-        with mock.patch.object(
-            self.driver, 'connect', return_value=True
-        ) as mock_connect:
-            self.driver.setup_connection()
-            mock_connect.assert_called_once()
-            mock_log.info.assert_called_with('KISS TCP Connection to 127.0.0.1:8001')
-
-        self.driver.is_enabled = is_en
-        self.driver.is_configured = is_con
-
-    @mock.patch('aprsd.client.drivers.tcpkiss.LOG')
-    def test_setup_connection_failure(self, mock_log):
-        """Test setup_connection handles connection failure."""
-        # Mock the connect method to fail
-        with mock.patch.object(
-            self.driver, 'connect', return_value=False
-        ) as mock_connect:
-            self.driver.setup_connection()
-            mock_connect.assert_called_once()
-            mock_log.error.assert_called_with('Failed to connect to KISS interface')
-
-    @mock.patch('aprsd.client.drivers.tcpkiss.LOG')
-    def test_setup_connection_exception(self, mock_log):
-        """Test setup_connection handles exceptions."""
-        # Mock the connect method to raise an exception
-        with mock.patch.object(
-            self.driver, 'connect', side_effect=Exception('Test error')
-        ) as mock_connect:
-            self.driver.setup_connection()
-            mock_connect.assert_called_once()
-            mock_log.error.assert_any_call('Failed to initialize KISS interface')
-            mock_log.exception.assert_called_once()
-            self.assertFalse(self.driver._connected)
-
-    def test_set_filter(self):
-        """Test set_filter does nothing for KISS."""
-        # Just ensure it doesn't fail
-        self.driver.set_filter('test/filter')
-
-    def test_login_success_when_connected(self):
-        """Test login_success returns True when connected."""
-        self.driver._connected = True
-        self.assertTrue(self.driver.login_success())
-
-    def test_login_success_when_not_connected(self):
-        """Test login_success returns False when not connected."""
-        self.driver._connected = False
-        self.assertFalse(self.driver.login_success())
-
-    def test_login_failure(self):
-        """Test login_failure returns success message."""
-        self.assertEqual(self.driver.login_failure(), 'Login successful')
-
-    @mock.patch('aprsd.client.drivers.tcpkiss.ax25frame.Frame.ui')
-    def test_send_packet(self, mock_frame_ui):
-        """Test sending an APRS packet."""
-        # Create a mock frame
-        mock_frame = mock.MagicMock()
-        mock_frame_bytes = b'mock_frame_data'
-        mock_frame.__bytes__ = mock.MagicMock(return_value=mock_frame_bytes)
-        mock_frame_ui.return_value = mock_frame
|
|
||||||
|
|
||||||
# Set up the driver
|
|
||||||
self.driver.socket = self.mock_socket
|
|
||||||
self.driver.path = ['WIDE1-1', 'WIDE2-1']
|
|
||||||
|
|
||||||
# Create a mock packet
|
|
||||||
mock_packet = mock.MagicMock(spec=core.Packet)
|
|
||||||
mock_bytes = b'Test packet data'
|
|
||||||
mock_packet.__bytes__ = mock.MagicMock(return_value=mock_bytes)
|
|
||||||
# Add path attribute to the mock packet
|
|
||||||
mock_packet.path = None
|
|
||||||
|
|
||||||
# Send the packet
|
|
||||||
self.driver.send(mock_packet)
|
|
||||||
|
|
||||||
# Check that frame was created correctly
|
|
||||||
mock_frame_ui.assert_called_once_with(
|
|
||||||
destination='APZ100',
|
|
||||||
source=mock_packet.from_call,
|
|
||||||
path=self.driver.path,
|
|
||||||
info=mock_packet.payload.encode('US-ASCII'),
|
|
||||||
)
|
|
||||||
|
|
||||||
# Check that socket send was called
|
|
||||||
self.mock_socket.send.assert_called_once()
|
|
||||||
|
|
||||||
# Verify packet counters updated
|
|
||||||
self.assertEqual(self.driver.packets_sent, 1)
|
|
||||||
self.assertIsNotNone(self.driver.last_packet_sent)
|
|
||||||
|
|
||||||
def test_send_with_no_socket(self):
|
|
||||||
"""Test send raises exception when socket not initialized."""
|
|
||||||
self.driver.socket = None
|
|
||||||
mock_packet = mock.MagicMock(spec=core.Packet)
|
|
||||||
|
|
||||||
with self.assertRaises(Exception) as context:
|
|
||||||
self.driver.send(mock_packet)
|
|
||||||
self.assertIn('KISS interface not initialized', str(context.exception))
|
|
||||||
|
|
||||||
def test_stop(self):
|
|
||||||
"""Test stop method cleans up properly."""
|
|
||||||
self.driver._running = True
|
|
||||||
self.driver._connected = True
|
|
||||||
self.driver.socket = self.mock_socket
|
|
||||||
|
|
||||||
self.driver.stop()
|
|
||||||
|
|
||||||
self.assertFalse(self.driver._running)
|
|
||||||
self.assertFalse(self.driver._connected)
|
|
||||||
self.mock_socket.close.assert_called_once()
|
|
||||||
|
|
||||||
def test_stats(self):
|
|
||||||
"""Test stats method returns correct data."""
|
|
||||||
# Set up test data
|
|
||||||
self.driver._connected = True
|
|
||||||
self.driver.path = ['WIDE1-1', 'WIDE2-1']
|
|
||||||
self.driver.packets_sent = 5
|
|
||||||
self.driver.packets_received = 3
|
|
||||||
self.driver.last_packet_sent = datetime.datetime.now()
|
|
||||||
self.driver.last_packet_received = datetime.datetime.now()
|
|
||||||
|
|
||||||
# Get stats
|
|
||||||
stats = self.driver.stats()
|
|
||||||
|
|
||||||
# Check stats contains expected keys
|
|
||||||
expected_keys = [
|
|
||||||
'client',
|
|
||||||
'transport',
|
|
||||||
'connected',
|
|
||||||
'path',
|
|
||||||
'packets_sent',
|
|
||||||
'packets_received',
|
|
||||||
'last_packet_sent',
|
|
||||||
'last_packet_received',
|
|
||||||
'connection_keepalive',
|
|
||||||
'host',
|
|
||||||
'port',
|
|
||||||
]
|
|
||||||
for key in expected_keys:
|
|
||||||
self.assertIn(key, stats)
|
|
||||||
|
|
||||||
# Check some specific values
|
|
||||||
self.assertEqual(stats['client'], 'TCPKISSDriver')
|
|
||||||
self.assertEqual(stats['transport'], 'tcpkiss')
|
|
||||||
self.assertEqual(stats['connected'], True)
|
|
||||||
self.assertEqual(stats['packets_sent'], 5)
|
|
||||||
self.assertEqual(stats['packets_received'], 3)
|
|
||||||
|
|
||||||
def test_stats_serializable(self):
|
|
||||||
"""Test stats with serializable=True converts datetime to ISO format."""
|
|
||||||
self.driver.keepalive = datetime.datetime.now()
|
|
||||||
|
|
||||||
stats = self.driver.stats(serializable=True)
|
|
||||||
|
|
||||||
# Check keepalive is a string in ISO format
|
|
||||||
self.assertIsInstance(stats['connection_keepalive'], str)
|
|
||||||
# Try parsing it to verify it's a valid ISO format
|
|
||||||
try:
|
|
||||||
datetime.datetime.fromisoformat(stats['connection_keepalive'])
|
|
||||||
except ValueError:
|
|
||||||
self.fail('keepalive is not in valid ISO format')
|
|
||||||
|
|
||||||
def test_connect_success(self):
|
|
||||||
"""Test successful connection."""
|
|
||||||
result = self.driver.connect()
|
|
||||||
|
|
||||||
self.assertTrue(result)
|
|
||||||
self.assertTrue(self.driver._connected)
|
|
||||||
self.mock_socket.connect.assert_called_once_with(
|
|
||||||
(self.mock_conf.kiss_tcp.host, self.mock_conf.kiss_tcp.port)
|
|
||||||
)
|
|
||||||
self.mock_socket.settimeout.assert_any_call(5.0)
|
|
||||||
self.mock_socket.settimeout.assert_any_call(0.1)
|
|
||||||
|
|
||||||
def test_connect_failure_socket_error(self):
|
|
||||||
"""Test connection failure due to socket error."""
|
|
||||||
self.mock_socket.connect.side_effect = socket.error('Test socket error')
|
|
||||||
|
|
||||||
result = self.driver.connect()
|
|
||||||
|
|
||||||
self.assertFalse(result)
|
|
||||||
self.assertFalse(self.driver._connected)
|
|
||||||
|
|
||||||
def test_connect_failure_timeout(self):
|
|
||||||
"""Test connection failure due to timeout."""
|
|
||||||
self.mock_socket.connect.side_effect = socket.timeout('Test timeout')
|
|
||||||
|
|
||||||
result = self.driver.connect()
|
|
||||||
|
|
||||||
self.assertFalse(result)
|
|
||||||
self.assertFalse(self.driver._connected)
|
|
||||||
|
|
||||||
def test_fix_raw_frame(self):
|
|
||||||
"""Test fix_raw_frame removes KISS markers and handles FEND."""
|
|
||||||
# Create a test frame with KISS markers
|
|
||||||
with mock.patch(
|
|
||||||
'aprsd.client.drivers.tcpkiss.handle_fend', return_value=b'fixed_frame'
|
|
||||||
) as mock_handle_fend:
|
|
||||||
raw_frame = b'\xc0\x00some_frame_data\xc0' # \xc0 is FEND
|
|
||||||
|
|
||||||
result = self.driver.fix_raw_frame(raw_frame)
|
|
||||||
|
|
||||||
mock_handle_fend.assert_called_once_with(b'some_frame_data')
|
|
||||||
self.assertEqual(result, b'fixed_frame')
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_decode_packet_success(self, mock_log):
|
|
||||||
"""Test successful packet decoding."""
|
|
||||||
mock_frame = 'test frame data'
|
|
||||||
mock_aprs_data = {'from': 'TEST-1', 'to': 'APRS'}
|
|
||||||
mock_packet = mock.MagicMock(spec=core.Packet)
|
|
||||||
|
|
||||||
with mock.patch(
|
|
||||||
'aprsd.client.drivers.tcpkiss.aprslib.parse', return_value=mock_aprs_data
|
|
||||||
) as mock_parse:
|
|
||||||
with mock.patch(
|
|
||||||
'aprsd.client.drivers.tcpkiss.core.factory', return_value=mock_packet
|
|
||||||
) as mock_factory:
|
|
||||||
result = self.driver.decode_packet(frame=mock_frame)
|
|
||||||
|
|
||||||
mock_parse.assert_called_once_with(str(mock_frame))
|
|
||||||
mock_factory.assert_called_once_with(mock_aprs_data)
|
|
||||||
self.assertEqual(result, mock_packet)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_decode_packet_no_frame(self, mock_log):
|
|
||||||
"""Test decode_packet with no frame returns None."""
|
|
||||||
result = self.driver.decode_packet()
|
|
||||||
|
|
||||||
self.assertIsNone(result)
|
|
||||||
mock_log.warning.assert_called_once()
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_decode_packet_exception(self, mock_log):
|
|
||||||
"""Test decode_packet handles exceptions."""
|
|
||||||
mock_frame = 'invalid frame'
|
|
||||||
|
|
||||||
with mock.patch(
|
|
||||||
'aprsd.client.drivers.tcpkiss.aprslib.parse',
|
|
||||||
side_effect=Exception('Test error'),
|
|
||||||
) as mock_parse:
|
|
||||||
result = self.driver.decode_packet(frame=mock_frame)
|
|
||||||
|
|
||||||
mock_parse.assert_called_once()
|
|
||||||
self.assertIsNone(result)
|
|
||||||
mock_log.error.assert_called_once()
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_consumer_with_frame(self, mock_log):
|
|
||||||
"""Test consumer processes frames and calls callback."""
|
|
||||||
mock_callback = mock.MagicMock()
|
|
||||||
mock_frame = mock.MagicMock()
|
|
||||||
|
|
||||||
# Configure driver for test
|
|
||||||
self.driver._connected = True
|
|
||||||
self.driver._running = True
|
|
||||||
|
|
||||||
# Set up read_frame to return one frame then stop
|
|
||||||
def side_effect():
|
|
||||||
self.driver._running = False
|
|
||||||
return mock_frame
|
|
||||||
|
|
||||||
with mock.patch.object(
|
|
||||||
self.driver, 'read_frame', side_effect=side_effect
|
|
||||||
) as mock_read_frame:
|
|
||||||
self.driver.consumer(mock_callback)
|
|
||||||
|
|
||||||
mock_read_frame.assert_called_once()
|
|
||||||
mock_callback.assert_called_once_with(frame=mock_frame)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_consumer_with_connect_reconnect(self, mock_log):
|
|
||||||
"""Test consumer tries to reconnect when not connected."""
|
|
||||||
mock_callback = mock.MagicMock()
|
|
||||||
|
|
||||||
# Configure driver for test
|
|
||||||
self.driver._connected = False
|
|
||||||
|
|
||||||
# Setup to run once then stop
|
|
||||||
call_count = 0
|
|
||||||
|
|
||||||
def connect_side_effect():
|
|
||||||
nonlocal call_count
|
|
||||||
call_count += 1
|
|
||||||
# On second call, connect successfully
|
|
||||||
if call_count == 2:
|
|
||||||
self.driver._running = False
|
|
||||||
self.driver.socket = self.mock_socket
|
|
||||||
return True
|
|
||||||
return False
|
|
||||||
|
|
||||||
with mock.patch.object(
|
|
||||||
self.driver, 'connect', side_effect=connect_side_effect
|
|
||||||
) as mock_connect:
|
|
||||||
with mock.patch('aprsd.client.drivers.tcpkiss.time.sleep') as mock_sleep:
|
|
||||||
self.driver.consumer(mock_callback)
|
|
||||||
|
|
||||||
self.assertEqual(mock_connect.call_count, 2)
|
|
||||||
mock_sleep.assert_called_once_with(1)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_read_frame_success(self, mock_log):
|
|
||||||
"""Test read_frame successfully reads a frame."""
|
|
||||||
# Set up driver
|
|
||||||
self.driver.socket = self.mock_socket
|
|
||||||
self.driver._running = True
|
|
||||||
|
|
||||||
# Mock socket recv to return data
|
|
||||||
raw_data = b'\xc0\x00test_frame\xc0'
|
|
||||||
self.mock_socket.recv.return_value = raw_data
|
|
||||||
|
|
||||||
# Mock select to indicate socket is readable
|
|
||||||
self.mock_select.select.return_value = ([self.mock_socket], [], [])
|
|
||||||
|
|
||||||
# Mock fix_raw_frame and Frame.from_bytes
|
|
||||||
mock_fixed_frame = b'fixed_frame'
|
|
||||||
mock_ax25_frame = mock.MagicMock()
|
|
||||||
|
|
||||||
with mock.patch.object(
|
|
||||||
self.driver, 'fix_raw_frame', return_value=mock_fixed_frame
|
|
||||||
) as mock_fix:
|
|
||||||
with mock.patch(
|
|
||||||
'aprsd.client.drivers.tcpkiss.ax25frame.Frame.from_bytes',
|
|
||||||
return_value=mock_ax25_frame,
|
|
||||||
) as mock_from_bytes:
|
|
||||||
result = self.driver.read_frame()
|
|
||||||
|
|
||||||
self.mock_socket.setblocking.assert_called_once_with(0)
|
|
||||||
self.mock_select.select.assert_called_once()
|
|
||||||
self.mock_socket.recv.assert_called_once()
|
|
||||||
mock_fix.assert_called_once_with(raw_data)
|
|
||||||
mock_from_bytes.assert_called_once_with(mock_fixed_frame)
|
|
||||||
self.assertEqual(result, mock_ax25_frame)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_read_frame_select_timeout(self, mock_log):
|
|
||||||
"""Test read_frame handles select timeout."""
|
|
||||||
# Set up driver
|
|
||||||
self.driver.socket = self.mock_socket
|
|
||||||
self.driver._running = True
|
|
||||||
|
|
||||||
# Mock select to indicate no readable sockets
|
|
||||||
self.mock_select.select.return_value = ([], [], [])
|
|
||||||
|
|
||||||
result = self.driver.read_frame()
|
|
||||||
|
|
||||||
self.assertIsNone(result)
|
|
||||||
|
|
||||||
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
|
|
||||||
def test_read_frame_socket_error(self, mock_log):
|
|
||||||
"""Test read_frame handles socket error."""
|
|
||||||
# Set up driver
|
|
||||||
self.driver.socket = self.mock_socket
|
|
||||||
self.driver._running = True
|
|
||||||
|
|
||||||
# Mock setblocking to raise OSError
|
|
||||||
self.mock_socket.setblocking.side_effect = OSError('Test error')
|
|
||||||
|
|
||||||
with self.assertRaises(aprslib.ConnectionDrop):
|
|
||||||
self.driver.read_frame()
|
|
||||||
mock_log.error.assert_called_once()
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
|
||||||
unittest.main()
|
|
89
tests/client/test_aprsis.py
Normal file
@ -0,0 +1,89 @@
import datetime
import unittest
from unittest import mock

from aprsd import exception
from aprsd.client.aprsis import APRSISClient


class TestAPRSISClient(unittest.TestCase):
    """Test cases for APRSISClient."""

    def setUp(self):
        """Set up test fixtures."""
        super().setUp()

        # Mock the config
        self.mock_conf = mock.MagicMock()
        self.mock_conf.aprs_network.enabled = True
        self.mock_conf.aprs_network.login = "TEST"
        self.mock_conf.aprs_network.password = "12345"
        self.mock_conf.aprs_network.host = "localhost"
        self.mock_conf.aprs_network.port = 14580

    @mock.patch("aprsd.client.base.APRSClient")
    @mock.patch("aprsd.client.drivers.aprsis.Aprsdis")
    def test_stats_not_configured(self, mock_aprsdis, mock_base):
        """Test stats when client is not configured."""
        mock_client = mock.MagicMock()
        mock_aprsdis.return_value = mock_client

        with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
            self.client = APRSISClient()

        with mock.patch.object(APRSISClient, "is_configured", return_value=False):
            stats = self.client.stats()
            self.assertEqual({}, stats)

    @mock.patch("aprsd.client.base.APRSClient")
    @mock.patch("aprsd.client.drivers.aprsis.Aprsdis")
    def test_stats_configured(self, mock_aprsdis, mock_base):
        """Test stats when client is configured."""
        mock_client = mock.MagicMock()
        mock_aprsdis.return_value = mock_client

        with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
            self.client = APRSISClient()

        mock_client = mock.MagicMock()
        mock_client.server_string = "test.server:14580"
        mock_client.aprsd_keepalive = datetime.datetime.now()
        self.client._client = mock_client
        self.client.filter = "m/50"

        with mock.patch.object(APRSISClient, "is_configured", return_value=True):
            stats = self.client.stats()
            from rich.console import Console

            c = Console()
            c.print(stats)
            self.assertEqual(
                {
                    "connected": True,
                    "filter": "m/50",
                    "login_status": {"message": mock.ANY, "success": True},
                    "connection_keepalive": mock_client.aprsd_keepalive,
                    "server_string": mock_client.server_string,
                    "transport": "aprsis",
                },
                stats,
            )

    def test_is_configured_missing_login(self):
        """Test is_configured with missing login."""
        self.mock_conf.aprs_network.login = None
        with self.assertRaises(exception.MissingConfigOptionException):
            APRSISClient.is_configured()

    def test_is_configured_missing_password(self):
        """Test is_configured with missing password."""
        self.mock_conf.aprs_network.password = None
        with self.assertRaises(exception.MissingConfigOptionException):
            APRSISClient.is_configured()

    def test_is_configured_missing_host(self):
        """Test is_configured with missing host."""
        self.mock_conf.aprs_network.host = None
        with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
            with self.assertRaises(exception.MissingConfigOptionException):
                APRSISClient.is_configured()
141
tests/client/test_client_base.py
Normal file
@ -0,0 +1,141 @@
import unittest
from unittest import mock

from aprsd.client.base import APRSClient
from aprsd.packets import core


class MockAPRSClient(APRSClient):
    """Concrete implementation of APRSClient for testing."""

    def stats(self):
        return {"packets_received": 0, "packets_sent": 0}

    def setup_connection(self):
        mock_connection = mock.MagicMock()
        # Configure the mock with required methods
        mock_connection.close = mock.MagicMock()
        mock_connection.stop = mock.MagicMock()
        mock_connection.set_filter = mock.MagicMock()
        mock_connection.send = mock.MagicMock()
        self._client = mock_connection
        return mock_connection

    def decode_packet(self, *args, **kwargs):
        return mock.MagicMock()

    def consumer(self, callback, blocking=False, immortal=False, raw=False):
        pass

    def is_alive(self):
        return True

    def close(self):
        pass

    @staticmethod
    def is_enabled():
        return True

    @staticmethod
    def transport():
        return "mock"

    def reset(self):
        """Mock implementation of reset."""
        if self._client:
            self._client.close()
        self._client = self.setup_connection()
        if self.filter:
            self._client.set_filter(self.filter)


class TestAPRSClient(unittest.TestCase):
    def setUp(self):
        # Reset the singleton instance before each test
        APRSClient._instance = None
        APRSClient._client = None
        self.client = MockAPRSClient()

    def test_singleton_pattern(self):
        """Test that multiple instantiations return the same instance."""
        client1 = MockAPRSClient()
        client2 = MockAPRSClient()
        self.assertIs(client1, client2)

    def test_set_filter(self):
        """Test setting APRS filter."""
        # Get the existing mock client that was created in __init__
        mock_client = self.client._client

        test_filter = "m/50"
        self.client.set_filter(test_filter)
        self.assertEqual(self.client.filter, test_filter)
        # The filter is set once during set_filter() and once during reset()
        mock_client.set_filter.assert_called_with(test_filter)

    @mock.patch("aprsd.client.base.LOG")
    def test_reset(self, mock_log):
        """Test client reset functionality."""
        # Create a new mock client with the necessary methods
        old_client = mock.MagicMock()
        self.client._client = old_client

        self.client.reset()

        # Verify the old client was closed
        old_client.close.assert_called_once()

        # Verify a new client was created
        self.assertIsNotNone(self.client._client)
        self.assertNotEqual(old_client, self.client._client)

    def test_send_packet(self):
        """Test sending an APRS packet."""
        mock_packet = mock.Mock(spec=core.Packet)
        self.client.send(mock_packet)
        self.client._client.send.assert_called_once_with(mock_packet)

    def test_stop(self):
        """Test stopping the client."""
        # Ensure client is created first
        self.client._create_client()

        self.client.stop()
        self.client._client.stop.assert_called_once()

    @mock.patch("aprsd.client.base.LOG")
    def test_create_client_failure(self, mock_log):
        """Test handling of client creation failure."""
        # Make setup_connection raise an exception
        with mock.patch.object(
            self.client,
            "setup_connection",
            side_effect=Exception("Connection failed"),
        ):
            with self.assertRaises(Exception):
                self.client._create_client()

            self.assertIsNone(self.client._client)
            mock_log.error.assert_called_once()

    def test_client_property(self):
        """Test the client property creates client if none exists."""
        self.client._client = None
        client = self.client.client
        self.assertIsNotNone(client)

    def test_filter_applied_on_creation(self):
        """Test that filter is applied when creating new client."""
        test_filter = "m/50"
        self.client.set_filter(test_filter)

        # Force client recreation
        self.client.reset()

        # Verify filter was applied to new client
        self.client._client.set_filter.assert_called_with(test_filter)


if __name__ == "__main__":
    unittest.main()
75
tests/client/test_factory.py
Normal file
@ -0,0 +1,75 @@
import unittest
from unittest import mock

from aprsd.client.factory import Client, ClientFactory


class MockClient:
    """Mock client for testing."""

    @classmethod
    def is_enabled(cls):
        return True

    @classmethod
    def is_configured(cls):
        return True


class TestClientFactory(unittest.TestCase):
    """Test cases for ClientFactory."""

    def setUp(self):
        """Set up test fixtures."""
        self.factory = ClientFactory()
        # Clear any registered clients from previous tests
        self.factory.clients = []

    def test_singleton(self):
        """Test that ClientFactory is a singleton."""
        factory2 = ClientFactory()
        self.assertEqual(self.factory, factory2)

    def test_register_client(self):
        """Test registering a client."""
        self.factory.register(MockClient)
        self.assertIn(MockClient, self.factory.clients)

    def test_register_invalid_client(self):
        """Test registering an invalid client raises error."""
        invalid_client = mock.MagicMock(spec=Client)
        with self.assertRaises(ValueError):
            self.factory.register(invalid_client)

    def test_create_client(self):
        """Test creating a client."""
        self.factory.register(MockClient)
        client = self.factory.create()
        self.assertIsInstance(client, MockClient)

    def test_create_no_clients(self):
        """Test creating a client with no registered clients."""
        with self.assertRaises(Exception):
            self.factory.create()

    def test_is_client_enabled(self):
        """Test checking if any client is enabled."""
        self.factory.register(MockClient)
        self.assertTrue(self.factory.is_client_enabled())

    def test_is_client_enabled_none(self):
        """Test checking if any client is enabled when none are."""
        MockClient.is_enabled = classmethod(lambda cls: False)
        self.factory.register(MockClient)
        self.assertFalse(self.factory.is_client_enabled())

    def test_is_client_configured(self):
        """Test checking if any client is configured."""
        self.factory.register(MockClient)
        self.assertTrue(self.factory.is_client_configured())

    def test_is_client_configured_none(self):
        """Test checking if any client is configured when none are."""
        MockClient.is_configured = classmethod(lambda cls: False)
        self.factory.register(MockClient)
        self.assertFalse(self.factory.is_client_configured())
@ -1,100 +0,0 @@
import unittest
from unittest import mock

from aprsd.client.drivers.registry import DriverRegistry

from ..mock_client_driver import MockClientDriver


class TestDriverRegistry(unittest.TestCase):
    """Unit tests for the DriverRegistry class."""

    def setUp(self):
        # Reset the singleton instance before each test
        DriverRegistry._singleton_instances = {}
        self.registry = DriverRegistry()
        self.registry.drivers = []

        # Mock APRSISDriver completely
        self.aprsis_patcher = mock.patch('aprsd.client.drivers.aprsis.APRSISDriver')
        mock_aprsis_class = self.aprsis_patcher.start()
        mock_aprsis_class.is_enabled.return_value = False
        mock_aprsis_class.is_configured.return_value = False

        # Mock the instance methods as well
        mock_instance = mock_aprsis_class.return_value
        mock_instance.is_enabled.return_value = False
        mock_instance.is_configured.return_value = False

        # Mock CONF to prevent password check
        self.conf_patcher = mock.patch('aprsd.client.drivers.aprsis.CONF')
        mock_conf = self.conf_patcher.start()
        mock_conf.aprs_network.password = 'dummy'
        mock_conf.aprs_network.login = 'dummy'

    def tearDown(self):
        # Reset the singleton instance after each test
        DriverRegistry().drivers = []
        self.aprsis_patcher.stop()
        self.conf_patcher.stop()

    def test_get_driver_with_valid_driver(self):
        """Test getting an enabled and configured driver."""
        # Add an enabled and configured driver
        driver = MockClientDriver
        driver.is_enabled = mock.MagicMock(return_value=True)
        driver.is_configured = mock.MagicMock(return_value=True)
        self.registry.register(MockClientDriver)

        # Get the driver
        result = self.registry.get_driver()
        print(result)
        self.assertTrue(isinstance(result, MockClientDriver))

    def test_get_driver_with_disabled_driver(self):
        """Test getting a driver when only disabled drivers exist."""
        driver = MockClientDriver
        driver.is_enabled = mock.MagicMock(return_value=False)
        driver.is_configured = mock.MagicMock(return_value=False)
        self.registry.register(driver)

        with self.assertRaises(ValueError) as context:
            self.registry.get_driver()
        self.assertIn('No enabled driver found', str(context.exception))

    def test_get_driver_with_unconfigured_driver(self):
        """Test getting a driver when only unconfigured drivers exist."""
        driver = MockClientDriver
        driver.is_enabled = mock.MagicMock(return_value=True)
        driver.is_configured = mock.MagicMock(return_value=False)
        self.registry.register(driver)

        with self.assertRaises(ValueError) as context:
            self.registry.get_driver()
        self.assertIn('No enabled driver found', str(context.exception))

    def test_get_driver_with_no_drivers(self):
        """Test getting a driver when no drivers exist."""
        # Try to get a driver
        with self.assertRaises(ValueError) as context:
            self.registry.get_driver()
        self.assertIn('No enabled driver found', str(context.exception))

    def test_get_driver_with_multiple_drivers(self):
        """Test getting a driver when multiple valid drivers exist."""
        # Add multiple drivers
        driver1 = MockClientDriver
        driver1.is_enabled = mock.MagicMock(return_value=True)
        driver1.is_configured = mock.MagicMock(return_value=True)
        driver2 = MockClientDriver
        self.registry.register(driver1)
        self.registry.register(driver2)

        # Get the driver - should return the first one
        result = self.registry.get_driver()
        # We can only check that it's a MockDriver instance
        self.assertTrue(isinstance(result, MockClientDriver))


if __name__ == '__main__':
    unittest.main()
@ -1,76 +0,0 @@
from unittest import mock

from aprsd.packets import core


class MockClientDriver:
    """Mock implementation of ClientDriver for testing."""

    def __init__(self, enabled=True, configured=True):
        self.connected = False
        self._alive = True
        self._keepalive = None
        self.filter = None
        self._enabled = enabled
        self._configured = configured
        self.path = '/dev/ttyUSB0'
        self.login_status = {
            'success': True,
            'message': None,
        }

    @staticmethod
    def is_enabled():
        """Static method to check if driver is enabled."""
        return True

    @staticmethod
    def is_configured():
        """Static method to check if driver is configured."""
        return True

    def is_alive(self):
        """Instance method to check if driver is alive."""
        return self._alive

    def stats(self, serializable=False):
        """Return mock stats."""
        stats = {'packets_received': 0, 'packets_sent': 0}
        if serializable:
            stats['path'] = self.path
        return stats

    @property
    def login_success(self):
        """Property to get login success status."""
        return self.login_status['success']

    @property
    def login_failure(self):
        """Property to get login failure message."""
        return self.login_status['message']

    def decode_packet(self, *args, **kwargs):
        """Mock packet decoding."""
        packet = mock.MagicMock(spec=core.Packet)
        packet.raw = 'test packet'
        return packet

    def close(self):
        self.connected = False

    def setup_connection(self):
        self.connected = True

    def send(self, packet):
        return True

    def set_filter(self, filter_str):
        self.filter = filter_str

    @property
    def keepalive(self):
        return self._keepalive

    def consumer(self, callback, raw=False):
        pass
@ -7,11 +7,9 @@ from aprsd import ( # noqa: F401
     conf,
     packets,
 )
-from aprsd.client.drivers.registry import DriverRegistry
 from aprsd.plugins import notify as notify_plugin

 from .. import fake, test_plugin
-from ..mock_client_driver import MockClientDriver

 CONF = cfg.CONF
 DEFAULT_WATCHLIST_CALLSIGNS = fake.FAKE_FROM_CALLSIGN
@ -19,24 +17,9 @@ DEFAULT_WATCHLIST_CALLSIGNS = fake.FAKE_FROM_CALLSIGN

 class TestWatchListPlugin(test_plugin.TestPlugin):
     def setUp(self):
-        super().setUp()
         self.fromcall = fake.FAKE_FROM_CALLSIGN
         self.ack = 1

-        # Mock APRSISDriver
-        self.aprsis_patcher = mock.patch('aprsd.client.drivers.aprsis.APRSISDriver')
-        self.mock_aprsis = self.aprsis_patcher.start()
-        self.mock_aprsis.is_enabled.return_value = False
-        self.mock_aprsis.is_configured.return_value = False
-
-        # Register the mock driver
-        DriverRegistry().register(MockClientDriver)
-
-    def tearDown(self):
-        super().tearDown()
-        if hasattr(self, 'aprsis_patcher'):
-            self.aprsis_patcher.stop()
-
     def config_and_init(
         self,
         watchlist_enabled=True,
@ -47,9 +30,7 @@ class TestWatchListPlugin(test_plugin.TestPlugin):
     ):
         CONF.callsign = self.fromcall
         CONF.aprs_network.login = self.fromcall
-        CONF.aprs_fi.apiKey = 'something'
-        # Add mock password
-        CONF.aprs_network.password = '12345'
+        CONF.aprs_fi.apiKey = "something"

         # Set the watchlist specific config options
         CONF.watch_list.enabled = watchlist_enabled
@ -75,20 +56,22 @@ class TestAPRSDWatchListPluginBase(TestWatchListPlugin):
         plugin = fake.FakeWatchListPlugin()

         packet = fake.fake_packet(
-            message='version',
+            message="version",
             msg_number=1,
         )
         actual = plugin.filter(packet)
         expected = packets.NULL_MESSAGE
         self.assertEqual(expected, actual)

-    def test_watchlist_not_in_watchlist(self):
+    @mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
+    def test_watchlist_not_in_watchlist(self, mock_factory):
+        client.client_factory = mock_factory
         self.config_and_init()
         plugin = fake.FakeWatchListPlugin()

         packet = fake.fake_packet(
-            fromcall='FAKE',
-            message='version',
+            fromcall="FAKE",
+            message="version",
             msg_number=1,
         )
         actual = plugin.filter(packet)
@ -102,77 +85,87 @@ class TestNotifySeenPlugin(TestWatchListPlugin):
         plugin = notify_plugin.NotifySeenPlugin()

         packet = fake.fake_packet(
-            message='version',
+            message="version",
             msg_number=1,
         )
         actual = plugin.filter(packet)
         expected = packets.NULL_MESSAGE
         self.assertEqual(expected, actual)

-    def test_callsign_not_in_watchlist(self):
+    @mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
+    def test_callsign_not_in_watchlist(self, mock_factory):
+        client.client_factory = mock_factory
         self.config_and_init(watchlist_enabled=False)
         plugin = notify_plugin.NotifySeenPlugin()

         packet = fake.fake_packet(
-            message='version',
+            message="version",
             msg_number=1,
         )
         actual = plugin.filter(packet)
         expected = packets.NULL_MESSAGE
         self.assertEqual(expected, actual)

-    @mock.patch('aprsd.packets.WatchList.is_old')
-    def test_callsign_in_watchlist_not_old(self, mock_is_old):
+    @mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
+    @mock.patch("aprsd.packets.WatchList.is_old")
+    def test_callsign_in_watchlist_not_old(self, mock_is_old, mock_factory):
+        client.client_factory = mock_factory
         mock_is_old.return_value = False
         self.config_and_init(
             watchlist_enabled=True,
-            watchlist_callsigns=['WB4BOR'],
+            watchlist_callsigns=["WB4BOR"],
         )
         plugin = notify_plugin.NotifySeenPlugin()

         packet = fake.fake_packet(
-            fromcall='WB4BOR',
-            message='ping',
+            fromcall="WB4BOR",
+            message="ping",
             msg_number=1,
         )
         actual = plugin.filter(packet)
         expected = packets.NULL_MESSAGE
         self.assertEqual(expected, actual)

-    @mock.patch('aprsd.packets.WatchList.is_old')
-    def test_callsign_in_watchlist_old_same_alert_callsign(self, mock_is_old):
+    @mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
+    @mock.patch("aprsd.packets.WatchList.is_old")
+    def test_callsign_in_watchlist_old_same_alert_callsign(
+        self, mock_is_old, mock_factory
+    ):
+        client.client_factory = mock_factory
         mock_is_old.return_value = True
         self.config_and_init(
             watchlist_enabled=True,
-            watchlist_alert_callsign='WB4BOR',
-            watchlist_callsigns=['WB4BOR'],
+            watchlist_alert_callsign="WB4BOR",
+            watchlist_callsigns=["WB4BOR"],
         )
         plugin = notify_plugin.NotifySeenPlugin()

         packet = fake.fake_packet(
-            fromcall='WB4BOR',
-            message='ping',
+            fromcall="WB4BOR",
+            message="ping",
             msg_number=1,
         )
         actual = plugin.filter(packet)
         expected = packets.NULL_MESSAGE
         self.assertEqual(expected, actual)

-    @mock.patch('aprsd.packets.WatchList.is_old')
-    def test_callsign_in_watchlist_old_send_alert(self, mock_is_old):
+    @mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
+    @mock.patch("aprsd.packets.WatchList.is_old")
+    def test_callsign_in_watchlist_old_send_alert(self, mock_is_old, mock_factory):
+        client.client_factory = mock_factory
         mock_is_old.return_value = True
         notify_callsign = fake.FAKE_TO_CALLSIGN
-        fromcall = 'WB4BOR'
+        fromcall = "WB4BOR"
         self.config_and_init(
             watchlist_enabled=True,
             watchlist_alert_callsign=notify_callsign,
-            watchlist_callsigns=['WB4BOR'],
+            watchlist_callsigns=["WB4BOR"],
         )
         plugin = notify_plugin.NotifySeenPlugin()

         packet = fake.fake_packet(
             fromcall=fromcall,
-            message='ping',
+            message="ping",
             msg_number=1,
         )
         packet_type = packet.__class__.__name__
@ -3,7 +3,6 @@ from unittest import mock
 from oslo_config import cfg

 import aprsd
-from aprsd.client.drivers.fake import APRSDFakeDriver
 from aprsd.plugins import version as version_plugin

 from .. import fake, test_plugin
@ -12,41 +11,16 @@ CONF = cfg.CONF


 class TestVersionPlugin(test_plugin.TestPlugin):
-    def setUp(self):
-        # make sure the fake client driver is enabled
-        # Mock CONF for testing
-        super().setUp()
-        self.conf_patcher = mock.patch('aprsd.client.drivers.fake.CONF')
-        self.mock_conf = self.conf_patcher.start()
-
-        # Configure fake_client.enabled
-        self.mock_conf.fake_client.enabled = True
-
-        # Create an instance of the driver
-        self.driver = APRSDFakeDriver()
-        self.fromcall = fake.FAKE_FROM_CALLSIGN
-
-    def tearDown(self):
-        self.conf_patcher.stop()
-        super().tearDown()
-
-    @mock.patch('aprsd.stats.collector.Collector')
-    def test_version(self, mock_collector_class):
-        # Set up the mock collector instance
-        mock_collector_instance = mock_collector_class.return_value
-        mock_collector_instance.collect.return_value = {
-            'APRSDStats': {
-                'uptime': '00:00:00',
-            }
-        }
-
-        expected = f'APRSD ver:{aprsd.__version__} uptime:00:00:00'
+    @mock.patch("aprsd.stats.app.APRSDStats.uptime")
+    def test_version(self, mock_stats):
+        mock_stats.return_value = "00:00:00"
+        expected = f"APRSD ver:{aprsd.__version__} uptime:00:00:00"
         CONF.callsign = fake.FAKE_TO_CALLSIGN
         version = version_plugin.VersionPlugin()
         version.enabled = True

         packet = fake.fake_packet(
-            message='No',
+            message="No",
             msg_number=1,
         )

@ -54,11 +28,8 @@ class TestVersionPlugin(test_plugin.TestPlugin):
         self.assertEqual(None, actual)

         packet = fake.fake_packet(
-            message='version',
+            message="version",
             msg_number=1,
         )
         actual = version.filter(packet)
         self.assertEqual(expected, actual)
-
-        # Verify the mock was called exactly once
-        mock_collector_instance.collect.assert_called_once()
@ -9,11 +9,9 @@ from aprsd import ( # noqa: F401
     plugins,
 )
 from aprsd import plugin as aprsd_plugin
-from aprsd.client.drivers.registry import DriverRegistry
 from aprsd.packets import core

 from . import fake
-from .mock_client_driver import MockClientDriver

 CONF = cfg.CONF

@ -23,24 +21,15 @@ class TestPluginManager(unittest.TestCase):
         self.fromcall = fake.FAKE_FROM_CALLSIGN
         self.config_and_init()

-        self.mock_driver = MockClientDriver()
-        # Mock the DriverRegistry to return our mock driver
-        self.registry_patcher = mock.patch.object(
-            DriverRegistry, 'get_driver', return_value=self.mock_driver
-        )
-        self.mock_registry = self.registry_patcher.start()
-
     def tearDown(self) -> None:
         self.config = None
         aprsd_plugin.PluginManager._instance = None
-        self.registry_patcher.stop()
-        self.mock_registry.stop()

     def config_and_init(self):
         CONF.callsign = self.fromcall
         CONF.aprs_network.login = fake.FAKE_TO_CALLSIGN
-        CONF.aprs_fi.apiKey = 'something'
-        CONF.enabled_plugins = 'aprsd.plugins.ping.PingPlugin'
+        CONF.aprs_fi.apiKey = "something"
+        CONF.enabled_plugins = "aprsd.plugins.ping.PingPlugin"
         CONF.enable_save = False

     def test_get_plugins_no_plugins(self):
@ -50,7 +39,7 @@
         self.assertEqual([], plugin_list)

     def test_get_plugins_with_plugins(self):
-        CONF.enabled_plugins = ['aprsd.plugins.ping.PingPlugin']
+        CONF.enabled_plugins = ["aprsd.plugins.ping.PingPlugin"]
         pm = aprsd_plugin.PluginManager()
         plugin_list = pm.get_plugins()
         self.assertEqual([], plugin_list)
@ -75,7 +64,7 @@
         self.assertEqual(0, len(plugin_list))

     def test_get_message_plugins(self):
-        CONF.enabled_plugins = ['aprsd.plugins.ping.PingPlugin']
+        CONF.enabled_plugins = ["aprsd.plugins.ping.PingPlugin"]
         pm = aprsd_plugin.PluginManager()
         plugin_list = pm.get_plugins()
         self.assertEqual([], plugin_list)
@ -98,31 +87,22 @@ class TestPlugin(unittest.TestCase):
         self.ack = 1
         self.config_and_init()

-        self.mock_driver = MockClientDriver()
-        # Mock the DriverRegistry to return our mock driver
-        self.registry_patcher = mock.patch.object(
-            DriverRegistry, 'get_driver', return_value=self.mock_driver
-        )
-        self.mock_registry = self.registry_patcher.start()
-
     def tearDown(self) -> None:
         packets.WatchList._instance = None
         packets.SeenList._instance = None
         packets.PacketTrack._instance = None
         self.config = None
-        self.registry_patcher.stop()
-        self.mock_registry.stop()

     def config_and_init(self):
         CONF.callsign = self.fromcall
         CONF.aprs_network.login = fake.FAKE_TO_CALLSIGN
-        CONF.aprs_fi.apiKey = 'something'
-        CONF.enabled_plugins = 'aprsd.plugins.ping.PingPlugin'
+        CONF.aprs_fi.apiKey = "something"
+        CONF.enabled_plugins = "aprsd.plugins.ping.PingPlugin"
         CONF.enable_save = False


 class TestPluginBase(TestPlugin):
-    @mock.patch.object(fake.FakeBaseNoThreadsPlugin, 'process')
+    @mock.patch.object(fake.FakeBaseNoThreadsPlugin, "process")
     def test_base_plugin_no_threads(self, mock_process):
         p = fake.FakeBaseNoThreadsPlugin()

@ -130,7 +110,7 @@ class TestPluginBase(TestPlugin):
         actual = p.create_threads()
         self.assertEqual(expected, actual)

-        expected = '1.0'
+        expected = "1.0"
         actual = p.version
         self.assertEqual(expected, actual)

@ -143,7 +123,7 @@ class TestPluginBase(TestPlugin):
         self.assertEqual(expected, actual)
         mock_process.assert_not_called()

-    @mock.patch.object(fake.FakeBaseThreadsPlugin, 'create_threads')
+    @mock.patch.object(fake.FakeBaseThreadsPlugin, "create_threads")
     def test_base_plugin_threads_created(self, mock_create):
         p = fake.FakeBaseThreadsPlugin()
         mock_create.assert_called_once()
@ -155,17 +135,17 @@ class TestPluginBase(TestPlugin):
         self.assertTrue(isinstance(actual, fake.FakeThread))
         p.stop_threads()

-    @mock.patch.object(fake.FakeRegexCommandPlugin, 'process')
+    @mock.patch.object(fake.FakeRegexCommandPlugin, "process")
     def test_regex_base_not_called(self, mock_process):
         CONF.callsign = fake.FAKE_TO_CALLSIGN
         p = fake.FakeRegexCommandPlugin()
-        packet = fake.fake_packet(message='a')
+        packet = fake.fake_packet(message="a")
         expected = None
         actual = p.filter(packet)
         self.assertEqual(expected, actual)
         mock_process.assert_not_called()

-        packet = fake.fake_packet(tocall='notMe', message='f')
+        packet = fake.fake_packet(tocall="notMe", message="f")
         expected = None
         actual = p.filter(packet)
         self.assertEqual(expected, actual)
@ -185,11 +165,11 @@ class TestPluginBase(TestPlugin):
         self.assertEqual(expected, actual)
         mock_process.assert_not_called()

-    @mock.patch.object(fake.FakeRegexCommandPlugin, 'process')
+    @mock.patch.object(fake.FakeRegexCommandPlugin, "process")
     def test_regex_base_assert_called(self, mock_process):
         CONF.callsign = fake.FAKE_TO_CALLSIGN
         p = fake.FakeRegexCommandPlugin()
-        packet = fake.fake_packet(message='f')
+        packet = fake.fake_packet(message="f")
         p.filter(packet)
         mock_process.assert_called_once()

@ -197,22 +177,22 @@ class TestPluginBase(TestPlugin):
         CONF.callsign = fake.FAKE_TO_CALLSIGN
         p = fake.FakeRegexCommandPlugin()

-        packet = fake.fake_packet(message='f')
+        packet = fake.fake_packet(message="f")
         expected = fake.FAKE_MESSAGE_TEXT
         actual = p.filter(packet)
         self.assertEqual(expected, actual)

-        packet = fake.fake_packet(message='F')
+        packet = fake.fake_packet(message="F")
         expected = fake.FAKE_MESSAGE_TEXT
         actual = p.filter(packet)
         self.assertEqual(expected, actual)

-        packet = fake.fake_packet(message='fake')
+        packet = fake.fake_packet(message="fake")
         expected = fake.FAKE_MESSAGE_TEXT
         actual = p.filter(packet)
         self.assertEqual(expected, actual)

-        packet = fake.fake_packet(message='FAKE')
+        packet = fake.fake_packet(message="FAKE")
         expected = fake.FAKE_MESSAGE_TEXT
         actual = p.filter(packet)
         self.assertEqual(expected, actual)
2
tox.ini
@ -25,7 +25,7 @@ deps =
     pytest-cov
     pytest
 commands =
-    pytest -s -v --cov-report term-missing --cov=aprsd {posargs}
+    pytest -v --cov-report term-missing --cov=aprsd {posargs}
     coverage: coverage report -m
     coverage: coverage xml
