mirror of https://github.com/craigerl/aprsd.git synced 2025-08-03 22:12:30 -04:00

Compare commits


41 Commits

Author SHA1 Message Date
fa5d0c643a log the exception when tx fails.
If _send_direct fails, we log the exception so we can find out why
it failed to send.
2025-06-17 22:23:06 -04:00
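A minimal sketch of the pattern this commit describes; _send_direct and the surrounding names here are illustrative, not the exact aprsd code:

import logging

LOG = logging.getLogger('APRSD')

def send_packet(packet, send_direct):
    # Wrap the direct send so a failure records the full traceback
    # instead of disappearing silently.
    try:
        send_direct(packet)
    except Exception:
        LOG.exception('Failed to send packet: %s', packet)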
74887af507 Fix tox failures
With the UnknownPacket
2025-06-17 10:51:42 -04:00
034f11b4f9 Make sure packet has addressee field
Before dereferencing it in the rx thread.
This covers the case of an UnknownPacket
2025-06-06 17:18:19 -04:00
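A small sketch of the kind of guard described above (the helper and attribute handling are illustrative, not the exact rx-thread code):

def packet_addressee(packet):
    # Only dereference addressee if the decoded packet actually has it;
    # an UnknownPacket may not carry the field at all.
    addressee = getattr(packet, 'addressee', None)
    return addressee.strip() if addressee else None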
1ae437581f Honor quiet setting for log.setup_logging
This updates setup_logging() to honor the quiet setting.

This means that aprsd won't log to stdout when quiet is passed in
as True.
2025-04-24 20:45:13 -04:00
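An illustrative sketch of what honoring the quiet flag can look like; this is not the actual aprsd log.setup_logging implementation:

import logging
import sys

def setup_logging(loglevel: str = 'INFO', quiet: bool = False) -> None:
    logger = logging.getLogger('APRSD')
    logger.setLevel(loglevel)
    # Only attach a stdout handler when quiet is False.
    if not quiet:
        logger.addHandler(logging.StreamHandler(sys.stdout))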
d41064ba05 Don't log an IOError on shutdown
This updates the consumer loop to only log an IOError if we are not
in the process of shutting down.
2025-04-24 20:44:07 -04:00
b201117e96
Merge pull request #191 from craigerl/client-refactor
Reworked the entire client and drivers
2025-04-23 21:06:13 -04:00
211ac1132b Updated requirements 2025-04-23 20:59:18 -04:00
acd639a910 Make some catchall fields non-hashable.
This makes non-hashable the packet fields that
will eventually end up being dicts or lists.
2025-04-23 20:57:16 -04:00
ce79f0112f Remove flask_enabled 2025-04-23 20:52:58 -04:00
4c53c13e79 Ensure filter is set
Ensure the filter is set when a client reset happens
2025-04-23 20:52:02 -04:00
5469610779 Fixed a problem with WeatherPacket
WeatherPacket was calling self.filter_for_send, which doesn't
exist.  It's self._filter_for_send.
2025-04-23 20:52:02 -04:00
1c39546bb9 Reworked the entire client and drivers
This patch includes a completely reworked client structure.
There is now only one client object, which loads the appropriate
drivers.  The drivers are fake, aprsis, and tcpkiss.

The TCPKISS client was written from scratch to avoid using asyncio.
Asyncio is nothing but a pain in the ass.
2025-04-23 20:52:02 -04:00
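A condensed sketch of the pattern this commit describes, loosely based on the new client.py and driver registry shown in the diff further below; the class names here are illustrative:

class ExampleDriverRegistry:
    def __init__(self):
        self.drivers = []

    def register(self, driver_cls):
        self.drivers.append(driver_cls)

    def get_driver(self):
        # Return an instance of the first driver that is enabled and configured.
        for driver_cls in self.drivers:
            if driver_cls.is_enabled() and driver_cls.is_configured():
                return driver_cls()
        raise ValueError('No enabled driver found')

class ExampleClient:
    """One client object that delegates everything to the selected driver."""

    def __init__(self, registry):
        self.driver = registry.get_driver()
        self.driver.setup_connection()

    def send(self, packet):
        return self.driver.send(packet)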
8f471c229c removed old flask_enabled global 2025-04-23 10:06:44 -04:00
2fcd574f12 Fixed setup for AVWXWeatherPlugin
This ensures that during setup() the plugin sets itself
to enabled if all checks pass.
2025-04-23 09:59:53 -04:00
44e2898c3f
Merge pull request #193 from craigerl/docker-compose
Update the docker-compose.yml
2025-04-22 09:16:37 -04:00
5a2bf6f3ba Update the docker-compose.yml
This updates the docker-compose.yml example file to show
how to install 2 plugins for the aprsd-server container
and how to deploy the admin interface container.
2025-04-21 15:53:14 -04:00
ad1e62b17c
Merge pull request #186 from pheezer/master
Use OpenWeatherMap One Call v3.0 API
2025-03-14 07:52:18 -04:00
Philip Duvall
bd83e53838
Merge branch 'master' into master 2025-03-07 22:06:44 -07:00
6dba56f74d Changelog for 4.1.2 2025-03-06 17:11:57 -05:00
d262589313 Allow passing in a custom handler to setup_logging
This adds the ability to pass in a custom log handler during
startup in the log.setup_logging().
2025-03-06 17:09:40 -05:00
7ed8028307 4.1.1 release 2025-03-05 17:00:12 -05:00
94ba915ed4 Fixed some more ruff checks
this updates some code to fix any more 'ruff check' failures.
2025-03-05 16:48:18 -05:00
2b185ee1b8 Update requirements
This removes the hard coded requirement for the old rich lib.
2025-03-04 14:01:27 -05:00
Philip Duvall
16bfda431c use OWM onecall v3.0 2025-03-01 09:29:05 -07:00
c1c89fd2c2 Added threads.service
This is just a handy wrapper that collects all the threads a service wants
to start, starts them all at once, and then joins them all.
2025-02-28 17:16:53 -05:00
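A rough sketch of the wrapper described above; the real aprsd.threads.service API may differ:

import threading

class ServiceThreads:
    """Collect threads for a service, start them together, then join them all."""

    def __init__(self):
        self._threads = []

    def register(self, thread: threading.Thread) -> None:
        self._threads.append(thread)

    def start(self) -> None:
        for t in self._threads:
            t.start()

    def join(self) -> None:
        for t in self._threads:
            t.join()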
0fa5b07d4b Added new config to disable logging to console
This patch adds a new config setting to the logging
section that allows the user to disable logging to stdout.
This is useful for terminal UI apps :)
2025-02-26 17:41:40 -05:00
a3cda9f37d Update Changelog for 4.1.0 release 2025-02-20 15:15:31 -05:00
06bdb34642 CONF.logging.enable_color option added
This allows the user to turn off ANSI color output for
logging to the console.
2025-02-20 12:44:16 -05:00
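An illustrative sketch only: pick a console formatter with or without ANSI color based on a CONF.logging.enable_color style option (names assumed, not the actual aprsd code):

import logging

def console_formatter(enable_color: bool) -> logging.Formatter:
    if enable_color:
        # Green timestamp via ANSI escape codes.
        fmt = '\x1b[32m%(asctime)s\x1b[0m %(levelname)s %(message)s'
    else:
        fmt = '%(asctime)s %(levelname)s %(message)s'
    return logging.Formatter(fmt)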
4fd64a3c25
Merge pull request #184 from craigerl/packet_filtering
Added new PacketFilter mechanism
2025-02-19 16:45:10 -05:00
361663e7d2 Changed Objectstore log to debug
This should help silence the log a bit.
2025-02-19 16:38:48 -05:00
fd517b3218 updated gitignore 2025-02-19 16:38:48 -05:00
e9e7e6b59f Fixed some pep8 failures. 2025-02-19 16:38:47 -05:00
52dac7e0a0 Added new PacketFilter mechanism
This patch adds the new PacketFilter class as a generic mechanism
for doing packet filtering during the packet processing phase of
receiving packets.

The packet phases are:
1. reception and stats collection
2. packet processing.

Each phase has a single thread for handling that phase.

Phase 1:
The APRSDRXThread connects to the APRS client and gets packets
from the client.  Then it puts the packet through the Collector
for stats and tracking.  Then the packet is put into the packet_queue.

Phase 2:
Packets are pulled from the packet_queue.  Then packets are run
through the PacketFilter mechanism, then processed depending
on the command being run.
By default there is one loaded packet filter, the DupePacketFilter,
which removes "duplicate" packets that aprsd has already seen and
processed within the configured time frame.

This PacketFilter mechanism allows an external extension or plugin
to add/remove packet filters at will depending on the function
of the extension or plugin.   For example, this allows an extension
to get a packet and push the packet into an MQTT queue.
2025-02-19 16:38:47 -05:00
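A minimal sketch of the shape such a filter registry can take; the class and method names below are illustrative, not the exact aprsd API:

class ExamplePacketFilter:
    """Run packets through registered filter instances; None drops the packet."""

    def __init__(self):
        self.filters = {}

    def register(self, filter_cls) -> None:
        self.filters[filter_cls] = filter_cls()

    def unregister(self, filter_cls) -> None:
        self.filters.pop(filter_cls, None)

    def filter(self, packet):
        for f in self.filters.values():
            packet = f.filter(packet)
            if packet is None:
                # A filter (e.g. a dupe filter) decided to drop this packet.
                return None
        return packet

An extension can then register or unregister its own filter class at runtime, which is what lets something like an MQTT-forwarding extension hook into the processing phase.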
b6da0ebb0d Fix runaway KISS driver on failed connection
This patch fixes an issue where the KISS connection fails to start
and/or goes away during the lifetime of the active connection.
APRSD would run away in a tight loop eating 100% CPU.  We now detect
when the underlying asyncio connection has failed and raise, which
induces a sleep in the consumer to try again.
2025-02-15 18:55:58 -05:00
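A tiny sketch of the idea: the driver raises when the underlying connection is gone, and the consumer backs off before retrying instead of spinning (names illustrative):

import time

def consume_forever(driver, process_packet):
    while True:
        try:
            # Raises (e.g. ConnectionError) when the transport has died,
            # instead of returning instantly in a tight loop.
            driver.consumer(process_packet)
        except ConnectionError:
            time.sleep(1)  # back off before reconnecting, instead of eating 100% CPU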
d82a81a2c3 fix for None packet in rx thread
This patch updates process_packet to ensure that when we
try to decode a packet, we actually get one.
2025-02-14 11:06:19 -05:00
6cd7e99713 Remove sleep in main RX thread
We had a bottleneck in pulling down packets as fast as possible,
caused by the time.sleep(1) call in the main RX
thread that used to be needed.
2025-02-03 13:27:57 -08:00
101904ca77 Try and stop chardet logging!
This patch adjusts settings on the logger to hopefully
stop any chardet logs from leaking into the aprsd logs.
2025-01-30 10:16:09 -08:00
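One common way to quiet a chatty third-party logger such as chardet is to raise its level so only warnings and above reach the root handlers; the exact settings aprsd applies may differ:

import logging

logging.getLogger('chardet').setLevel(logging.WARNING)
logging.getLogger('chardet.charsetprober').setLevel(logging.WARNING)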
227ddbf148 Update StatsStore to use existing lock
The base class for StatsStore already creates a lock; use that instead.
2025-01-30 10:07:28 -08:00
19c12e70f3 Updated packet_list to allow infinite max store
This patch adds logic so that setting packet_list_stats_maxlen to -1
means keep every packet for stats.
2025-01-30 10:04:59 -08:00
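A small sketch of the -1 convention (helper name assumed): treat -1 as "no limit" when building the bounded packet store:

from collections import deque

def make_packet_store(stats_maxlen: int) -> deque:
    # packet_list_stats_maxlen == -1 means keep every packet for stats.
    return deque(maxlen=None if stats_maxlen == -1 else stats_maxlen)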
1606585d41 Updated APRSIS driver
This patch adds an override to _connect to set some more
socket keepalive options.
2025-01-30 10:03:14 -08:00
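The options in question appear in the _connect() override in the diff below; as a standalone sketch, the same keepalive tuning looks roughly like this:

import socket

def apply_keepalive(sock: socket.socket) -> None:
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # macOS doesn't have TCP_KEEPIDLE, so guard that option.
    if hasattr(socket, 'TCP_KEEPIDLE'):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 1)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 3)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)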
3b57e7597d Update to build from pypi 2025-01-25 13:41:00 -05:00
65 changed files with 3965 additions and 2412 deletions


@ -41,8 +41,8 @@ jobs:
platforms: linux/amd64,linux/arm64
file: ./Dockerfile
build-args: |
INSTALL_TYPE=pypi
VERSION=${{ inputs.aprsd_version }}
BRANCH=${{ inputs.aprsd_version }}
BUILDX_QEMU_ENV=true
push: true
tags: |

.gitignore

@ -60,3 +60,9 @@ AUTHORS
Makefile.venv
# Copilot
.DS_Store
.python-version
.fleet
.vscode
.envrc
.doit.db


@ -4,6 +4,41 @@ All notable changes to this project will be documented in this file. Dates are d
Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
#### [4.1.2](https://github.com/craigerl/aprsd/compare/4.1.1...4.1.2)
> 6 March 2025
- Allow passing in a custom handler to setup_logging [`d262589`](https://github.com/craigerl/aprsd/commit/d2625893134f498748859da3b1684b04d456f790)
#### [4.1.1](https://github.com/craigerl/aprsd/compare/4.1.0...4.1.1)
> 5 March 2025
- Added new config to disable logging to console [`0fa5b07`](https://github.com/craigerl/aprsd/commit/0fa5b07d4bf4bc5d5aaad1de52b78058e472fe24)
- Added threads.service [`c1c89fd`](https://github.com/craigerl/aprsd/commit/c1c89fd2c2c69c5e6c5d29a736a7b89e3d45cfe2)
- Update requirements [`2b185ee`](https://github.com/craigerl/aprsd/commit/2b185ee1b84598c832d8a5d73753cb428854b932)
- Fixed some more ruff checks [`94ba915`](https://github.com/craigerl/aprsd/commit/94ba915ed44b11eaabc885e033669d67d8c341a5)
- 4.1.1 release [`7ed8028`](https://github.com/craigerl/aprsd/commit/7ed80283071c1ccebf1e3373727608edd0a56ee9)
#### [4.1.0](https://github.com/craigerl/aprsd/compare/4.0.2...4.1.0)
> 20 February 2025
- Added new PacketFilter mechanism [`#184`](https://github.com/craigerl/aprsd/pull/184)
- Update to build from pypi [`3b57e75`](https://github.com/craigerl/aprsd/commit/3b57e7597d77303ffc03b082370283bb2fea2838)
- Updated APRSIS driver [`1606585`](https://github.com/craigerl/aprsd/commit/1606585d41f69133192199d139b53344bb320fa9)
- Updated packet_list to allow infinite max store [`19c12e7`](https://github.com/craigerl/aprsd/commit/19c12e70f30a6f1f7d223a2f0fd3bf1182579fa4)
- Update StatsStore to use existing lock [`227ddbf`](https://github.com/craigerl/aprsd/commit/227ddbf148be2e14d4b4f27e48a4b091a98f15df)
- Try and stop chardet logging! [`101904c`](https://github.com/craigerl/aprsd/commit/101904ca77d816ae9e70bc7d22e6d8516fc3c5ce)
- Fixed some pep8 failures. [`e9e7e6b`](https://github.com/craigerl/aprsd/commit/e9e7e6b59f9f93f3f09142e56407bc87603a44cb)
- updated gitignore [`fd517b3`](https://github.com/craigerl/aprsd/commit/fd517b32188fdf15835a74fbd515ce417e7ef1f5)
- Remove sleep in main RX thread [`6cd7e99`](https://github.com/craigerl/aprsd/commit/6cd7e997139e8f2687bee753d9e0d2b22b1c42a3)
- Changed Objectstore log to debug [`361663e`](https://github.com/craigerl/aprsd/commit/361663e7d2cf43bd2fd53da0d8c5205bb848dbc2)
- fix for None packet in rx thread [`d82a81a`](https://github.com/craigerl/aprsd/commit/d82a81a2c3c1a7f50177a0a6435a555daeb858aa)
- Fix runaway KISS driver on failed connection [`b6da0eb`](https://github.com/craigerl/aprsd/commit/b6da0ebb0d2f4d7078dbbf91d8c03715412d89ea)
- CONF.logging.enable_color option added [`06bdb34`](https://github.com/craigerl/aprsd/commit/06bdb34642640d91ea96e3c6e8d8b5a4b8230611)
- Update Changelog for 4.1.0 release [`a3cda9f`](https://github.com/craigerl/aprsd/commit/a3cda9f37d4c9b955b523f46b2eb8cf412a84407)
#### [4.0.2](https://github.com/craigerl/aprsd/compare/4.0.1...4.0.2)
> 25 January 2025
@ -16,6 +51,7 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
- Added uv.lock [`2f26eb8`](https://github.com/craigerl/aprsd/commit/2f26eb86f44625547f72f7c3612494b1bc44bc99)
- Fix the testing of fortune path [`3c4e200`](https://github.com/craigerl/aprsd/commit/3c4e200d700c24125479bb754b5f68bdf35b85a6)
- update the install from github in Dockerfile [`bea4815`](https://github.com/craigerl/aprsd/commit/bea481555bc1270ab371a22c69973d648e526d54)
- Prep for 4.0.2 [`000adef`](https://github.com/craigerl/aprsd/commit/000adef6d4f2792d33980d59d37f4b139e0c693c)
#### [4.0.1](https://github.com/craigerl/aprsd/compare/4.0.0...4.0.1)
@ -24,11 +60,33 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
- Update pyproject for README.rst -> md [`e080394`](https://github.com/craigerl/aprsd/commit/e08039431ebde92a162ab422c05391dc55d3d3fa)
- Updated Changelog [`24f5672`](https://github.com/craigerl/aprsd/commit/24f567224cf8ecdebd51f49804425565883acb94)
### [4.0.0](https://github.com/craigerl/aprsd/compare/3.4.4...4.0.0)
### [4.0.0](https://github.com/craigerl/aprsd/compare/3.5.0...4.0.0)
> 24 January 2025
- Migrate admin web out of aprsd. [`#183`](https://github.com/craigerl/aprsd/pull/183)
- Enable packet stats for listen command in Docker [`e5d8796`](https://github.com/craigerl/aprsd/commit/e5d8796cda1a007aa868c760b96b50b364351519)
- Added activity to README [`cdd297c`](https://github.com/craigerl/aprsd/commit/cdd297c5bbc8b93f4739f5850a3e5971ce8baeba)
- Added star history to readme [`02e2940`](https://github.com/craigerl/aprsd/commit/02e29405ce2f8310e4f87f68498dfd6575c2e43b)
- removed pytest from README [`1cba31f`](https://github.com/craigerl/aprsd/commit/1cba31f0ac9bd5ee532721a909fc752f023f3b06)
- Updated Docker for using alpine and uv [`24db814`](https://github.com/craigerl/aprsd/commit/24db814c82c9bb6634566d7428603bf7a9ae37d1)
- Update the admin and setup.sh for container [`044ea4c`](https://github.com/craigerl/aprsd/commit/044ea4cc9a0059101851d6e722e986ee236833e8)
- added healthcheck.sh [`1054999`](https://github.com/craigerl/aprsd/commit/10549995686b08e4c166f780efdec5bdae496cab)
- updated healthcheck.sh [`dabb48c`](https://github.com/craigerl/aprsd/commit/dabb48c6f64062c1fed8f83a4f0b8ffba0c206a5)
- try making image for webchat [`ba8acdc`](https://github.com/craigerl/aprsd/commit/ba8acdc5849fc7b2d8a1ee11af6f5e317cf30f45)
- Added APRSD logo [`0ed648f`](https://github.com/craigerl/aprsd/commit/0ed648f8f8a961dbbd9e22bcebadcde525ee41ae)
- Added plugin and extension links [`447451c`](https://github.com/craigerl/aprsd/commit/447451c6c97e1f2d3d0bf580db21ecd176690258)
- reduced logo size 50% [`cf4a29f`](https://github.com/craigerl/aprsd/commit/cf4a29f0cb3ed366b21ec3120a189614e0955180)
- Updated README.md TOC [`375a5e5`](https://github.com/craigerl/aprsd/commit/375a5e5b34718cadc6ee8a51484fc91441440a61)
- chore: update AUTHORS [skip ci] [`c556f51`](https://github.com/craigerl/aprsd/commit/c556f5126f725904822a75427475d46986f8e9f3)
- Updated requirements [`4a7a902`](https://github.com/craigerl/aprsd/commit/4a7a902a337759a352560d4d92dc314b1726412a)
- Updated ChangeLog for 4.0.0 [`934ebd2`](https://github.com/craigerl/aprsd/commit/934ebd236d044625b911dd8ca45293f6c5680a68)
#### [3.5.0](https://github.com/craigerl/aprsd/compare/3.4.4...3.5.0)
> 10 January 2025
- Migrate admin web out of aprsd. [`c48ff8d`](https://github.com/craigerl/aprsd/commit/c48ff8dfd4bd4ce2f95b36e71dce13da5446a658)
- Remove webchat as a built in command. [`8f8887f`](https://github.com/craigerl/aprsd/commit/8f8887f0e496d960b0e71275893b75408a40fdb2)
- Remove email plugin [`0880a35`](https://github.com/craigerl/aprsd/commit/0880a356e6df1a0924cbf6e815e68cba5f5c6cf1)
- Fixed make clean [`ae28dbb`](https://github.com/craigerl/aprsd/commit/ae28dbb0e6bc216bf78c0bd9d7804f57b39091d1)
@ -37,7 +95,6 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
- Removed LocationPlugin from aprsd core [`3bba8a1`](https://github.com/craigerl/aprsd/commit/3bba8a19da88b0912064cea786bc9f8203038946)
- Include haversine library [`bbdbb9a`](https://github.com/craigerl/aprsd/commit/bbdbb9aba189d536497ea3cd7d30911fe3d9d706)
- Update Makefile [`caa4bb8`](https://github.com/craigerl/aprsd/commit/caa4bb8bd01cbd2e02024d75e1c8af97acf6c657)
- Enable packet stats for listen command in Docker [`e5d8796`](https://github.com/craigerl/aprsd/commit/e5d8796cda1a007aa868c760b96b50b364351519)
- Added new KeepAliveCollector [`30d1eb5`](https://github.com/craigerl/aprsd/commit/30d1eb57dd249c609f5b092d8084c40cadda7bd9)
- Changed to ruff [`72d068c`](https://github.com/craigerl/aprsd/commit/72d068c0b8944c8c9eed494fc23de8d7179ee09b)
- Changed README.rst -> README.md [`b1a830d`](https://github.com/craigerl/aprsd/commit/b1a830d54e9dec473074b34f9566f161bdec0030)
@ -58,21 +115,6 @@ Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
- Added .mailmap [`8d98546`](https://github.com/craigerl/aprsd/commit/8d9854605584fa35117af888fe219df610fb7cb4)
- updated tools in pre-commit [`e4f82d6`](https://github.com/craigerl/aprsd/commit/e4f82d6054d4d859023423bccdd5c402d7a83494)
- some cleanup [`e332d7c`](https://github.com/craigerl/aprsd/commit/e332d7c9d046066e2686ea0522ae06b86d2f162d)
- Added activity to README [`cdd297c`](https://github.com/craigerl/aprsd/commit/cdd297c5bbc8b93f4739f5850a3e5971ce8baeba)
- Added star history to readme [`02e2940`](https://github.com/craigerl/aprsd/commit/02e29405ce2f8310e4f87f68498dfd6575c2e43b)
- removed pytest from README [`1cba31f`](https://github.com/craigerl/aprsd/commit/1cba31f0ac9bd5ee532721a909fc752f023f3b06)
- Updated Docker for using alpine and uv [`24db814`](https://github.com/craigerl/aprsd/commit/24db814c82c9bb6634566d7428603bf7a9ae37d1)
- Update the admin and setup.sh for container [`044ea4c`](https://github.com/craigerl/aprsd/commit/044ea4cc9a0059101851d6e722e986ee236833e8)
- added healthcheck.sh [`1054999`](https://github.com/craigerl/aprsd/commit/10549995686b08e4c166f780efdec5bdae496cab)
- updated healthcheck.sh [`dabb48c`](https://github.com/craigerl/aprsd/commit/dabb48c6f64062c1fed8f83a4f0b8ffba0c206a5)
- try making image for webchat [`ba8acdc`](https://github.com/craigerl/aprsd/commit/ba8acdc5849fc7b2d8a1ee11af6f5e317cf30f45)
- Added APRSD logo [`0ed648f`](https://github.com/craigerl/aprsd/commit/0ed648f8f8a961dbbd9e22bcebadcde525ee41ae)
- Added plugin and extension links [`447451c`](https://github.com/craigerl/aprsd/commit/447451c6c97e1f2d3d0bf580db21ecd176690258)
- reduced logo size 50% [`cf4a29f`](https://github.com/craigerl/aprsd/commit/cf4a29f0cb3ed366b21ec3120a189614e0955180)
- Updated README.md TOC [`375a5e5`](https://github.com/craigerl/aprsd/commit/375a5e5b34718cadc6ee8a51484fc91441440a61)
- chore: update AUTHORS [skip ci] [`c556f51`](https://github.com/craigerl/aprsd/commit/c556f5126f725904822a75427475d46986f8e9f3)
- Updated requirements [`4a7a902`](https://github.com/craigerl/aprsd/commit/4a7a902a337759a352560d4d92dc314b1726412a)
- Updated ChangeLog for 4.0.0 [`934ebd2`](https://github.com/craigerl/aprsd/commit/934ebd236d044625b911dd8ca45293f6c5680a68)
#### [3.4.4](https://github.com/craigerl/aprsd/compare/3.4.3...3.4.4)


@ -13,35 +13,35 @@ from aprsd.utils import trace
CONF = cfg.CONF
home = str(Path.home())
DEFAULT_CONFIG_DIR = f"{home}/.config/aprsd/"
DEFAULT_SAVE_FILE = f"{home}/.config/aprsd/aprsd.p"
DEFAULT_CONFIG_FILE = f"{home}/.config/aprsd/aprsd.conf"
DEFAULT_CONFIG_DIR = f'{home}/.config/aprsd/'
DEFAULT_SAVE_FILE = f'{home}/.config/aprsd/aprsd.p'
DEFAULT_CONFIG_FILE = f'{home}/.config/aprsd/aprsd.conf'
F = t.TypeVar("F", bound=t.Callable[..., t.Any])
F = t.TypeVar('F', bound=t.Callable[..., t.Any])
common_options = [
click.option(
"--loglevel",
default="INFO",
'--loglevel',
default='INFO',
show_default=True,
type=click.Choice(
["CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"],
['CRITICAL', 'ERROR', 'WARNING', 'INFO', 'DEBUG'],
case_sensitive=False,
),
show_choices=True,
help="The log level to use for aprsd.log",
help='The log level to use for aprsd.log',
),
click.option(
"-c",
"--config",
"config_file",
'-c',
'--config',
'config_file',
show_default=True,
default=DEFAULT_CONFIG_FILE,
help="The aprsd config file to use for options.",
help='The aprsd config file to use for options.',
),
click.option(
"--quiet",
'--quiet',
is_flag=True,
default=False,
help="Don't log to stdout",
@ -59,7 +59,7 @@ class AliasedGroup(click.Group):
"""
def decorator(f):
aliases = kwargs.pop("aliases", [])
aliases = kwargs.pop('aliases', [])
cmd = click.decorators.command(*args, **kwargs)(f)
self.add_command(cmd)
for alias in aliases:
@ -77,7 +77,7 @@ class AliasedGroup(click.Group):
"""
def decorator(f):
aliases = kwargs.pop("aliases", [])
aliases = kwargs.pop('aliases', [])
cmd = click.decorators.group(*args, **kwargs)(f)
self.add_command(cmd)
for alias in aliases:
@ -101,36 +101,37 @@ def process_standard_options(f: F) -> F:
ctx = args[0]
ctx.ensure_object(dict)
config_file_found = True
if kwargs["config_file"]:
default_config_files = [kwargs["config_file"]]
if kwargs['config_file']:
default_config_files = [kwargs['config_file']]
else:
default_config_files = None
try:
CONF(
[],
project="aprsd",
project='aprsd',
version=aprsd.__version__,
default_config_files=default_config_files,
)
except cfg.ConfigFilesNotFoundError:
config_file_found = False
ctx.obj["loglevel"] = kwargs["loglevel"]
ctx.obj['loglevel'] = kwargs['loglevel']
# ctx.obj["config_file"] = kwargs["config_file"]
ctx.obj["quiet"] = kwargs["quiet"]
ctx.obj['quiet'] = kwargs['quiet']
log.setup_logging(
ctx.obj["loglevel"],
ctx.obj["quiet"],
ctx.obj['loglevel'],
ctx.obj['quiet'],
)
if CONF.trace_enabled:
trace.setup_tracing(["method", "api"])
trace.setup_tracing(['method', 'api'])
if not config_file_found:
LOG = logging.getLogger("APRSD") # noqa: N806
LOG = logging.getLogger('APRSD') # noqa: N806
LOG.error("No config file found!! run 'aprsd sample-config'")
del kwargs["loglevel"]
del kwargs["config_file"]
del kwargs["quiet"]
del kwargs['loglevel']
del kwargs['config_file']
del kwargs['quiet']
return f(*args, **kwargs)
return update_wrapper(t.cast(F, new_func), f)
@ -142,17 +143,17 @@ def process_standard_options_no_config(f: F) -> F:
def new_func(*args, **kwargs):
ctx = args[0]
ctx.ensure_object(dict)
ctx.obj["loglevel"] = kwargs["loglevel"]
ctx.obj["config_file"] = kwargs["config_file"]
ctx.obj["quiet"] = kwargs["quiet"]
ctx.obj['loglevel'] = kwargs['loglevel']
ctx.obj['config_file'] = kwargs['config_file']
ctx.obj['quiet'] = kwargs['quiet']
log.setup_logging(
ctx.obj["loglevel"],
ctx.obj["quiet"],
ctx.obj['loglevel'],
ctx.obj['quiet'],
)
del kwargs["loglevel"]
del kwargs["config_file"]
del kwargs["quiet"]
del kwargs['loglevel']
del kwargs['config_file']
del kwargs['quiet']
return f(*args, **kwargs)
return update_wrapper(t.cast(F, new_func), f)


@ -1,13 +1,5 @@
from aprsd.client import aprsis, factory, fake, kiss
TRANSPORT_APRSIS = "aprsis"
TRANSPORT_TCPKISS = "tcpkiss"
TRANSPORT_SERIALKISS = "serialkiss"
TRANSPORT_FAKE = "fake"
client_factory = factory.ClientFactory()
client_factory.register(aprsis.APRSISClient)
client_factory.register(kiss.KISSClient)
client_factory.register(fake.APRSDFakeClient)
# define the client transports here
TRANSPORT_APRSIS = 'aprsis'
TRANSPORT_TCPKISS = 'tcpkiss'
TRANSPORT_SERIALKISS = 'serialkiss'
TRANSPORT_FAKE = 'fake'


@ -1,183 +0,0 @@
import datetime
import logging
import time
import timeago
from aprslib.exceptions import LoginError
from loguru import logger
from oslo_config import cfg
from aprsd import client, exception
from aprsd.client import base
from aprsd.client.drivers import aprsis
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOGU = logger
class APRSISClient(base.APRSClient):
_client = None
_checks = False
def __init__(self):
max_timeout = {"hours": 0.0, "minutes": 2, "seconds": 0}
self.max_delta = datetime.timedelta(**max_timeout)
def stats(self, serializable=False) -> dict:
stats = {}
if self.is_configured():
if self._client:
keepalive = self._client.aprsd_keepalive
server_string = self._client.server_string
if serializable:
keepalive = keepalive.isoformat()
else:
keepalive = "None"
server_string = "None"
stats = {
"connected": self.is_connected,
"filter": self.filter,
"login_status": self.login_status,
"connection_keepalive": keepalive,
"server_string": server_string,
"transport": self.transport(),
}
return stats
def keepalive_check(self):
# Don't check the first time through.
if not self.is_alive() and self._checks:
LOG.warning("Resetting client. It's not alive.")
self.reset()
self._checks = True
def keepalive_log(self):
if ka := self._client.aprsd_keepalive:
keepalive = timeago.format(ka)
else:
keepalive = "N/A"
LOGU.opt(colors=True).info(f"<green>Client keepalive {keepalive}</green>")
@staticmethod
def is_enabled():
# Defaults to True if the enabled flag is non existent
try:
return CONF.aprs_network.enabled
except KeyError:
return False
@staticmethod
def is_configured():
if APRSISClient.is_enabled():
# Ensure that the config vars are correctly set
if not CONF.aprs_network.login:
LOG.error("Config aprs_network.login not set.")
raise exception.MissingConfigOptionException(
"aprs_network.login is not set.",
)
if not CONF.aprs_network.password:
LOG.error("Config aprs_network.password not set.")
raise exception.MissingConfigOptionException(
"aprs_network.password is not set.",
)
if not CONF.aprs_network.host:
LOG.error("Config aprs_network.host not set.")
raise exception.MissingConfigOptionException(
"aprs_network.host is not set.",
)
return True
return True
def _is_stale_connection(self):
delta = datetime.datetime.now() - self._client.aprsd_keepalive
if delta > self.max_delta:
LOG.error(f"Connection is stale, last heard {delta} ago.")
return True
return False
def is_alive(self):
if not self._client:
LOG.warning(f"APRS_CLIENT {self._client} alive? NO!!!")
return False
return self._client.is_alive() and not self._is_stale_connection()
def close(self):
if self._client:
self._client.stop()
self._client.close()
@staticmethod
def transport():
return client.TRANSPORT_APRSIS
def decode_packet(self, *args, **kwargs):
"""APRS lib already decodes this."""
return core.factory(args[0])
def setup_connection(self):
user = CONF.aprs_network.login
password = CONF.aprs_network.password
host = CONF.aprs_network.host
port = CONF.aprs_network.port
self.connected = False
backoff = 1
aprs_client = None
retries = 3
retry_count = 0
while not self.connected:
retry_count += 1
if retry_count >= retries:
break
try:
LOG.info(
f"Creating aprslib client({host}:{port}) and logging in {user}."
)
aprs_client = aprsis.Aprsdis(
user, passwd=password, host=host, port=port
)
# Force the log to be the same
aprs_client.logger = LOG
aprs_client.connect()
self.connected = self.login_status["success"] = True
self.login_status["message"] = aprs_client.server_string
backoff = 1
except LoginError as e:
LOG.error(f"Failed to login to APRS-IS Server '{e}'")
self.connected = self.login_status["success"] = False
self.login_status["message"] = e.message
LOG.error(e.message)
time.sleep(backoff)
except Exception as e:
LOG.error(f"Unable to connect to APRS-IS server. '{e}' ")
self.connected = self.login_status["success"] = False
self.login_status["message"] = e.message
time.sleep(backoff)
# Don't allow the backoff to go to infinity.
if backoff > 5:
backoff = 5
else:
backoff += 1
continue
self._client = aprs_client
return aprs_client
def consumer(self, callback, blocking=False, immortal=False, raw=False):
if self._client:
try:
self._client.consumer(
callback,
blocking=blocking,
immortal=immortal,
raw=raw,
)
except Exception as e:
LOG.error(e)
LOG.info(e.__cause__)
raise e
else:
LOG.warning("client is None, might be resetting.")
self.connected = False


@ -1,153 +0,0 @@
import abc
import logging
import threading
import wrapt
from oslo_config import cfg
from aprsd.packets import core
from aprsd.utils import keepalive_collector
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSClient:
"""Singleton client class that constructs the aprslib connection."""
_instance = None
_client = None
connected = False
login_status = {
"success": False,
"message": None,
}
filter = None
lock = threading.Lock()
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
keepalive_collector.KeepAliveCollector().register(cls)
# Put any initialization here.
cls._instance._create_client()
return cls._instance
@abc.abstractmethod
def stats(self) -> dict:
"""Return statistics about the client connection.
Returns:
dict: Statistics about the connection and packet handling
"""
@abc.abstractmethod
def keepalive_check(self) -> None:
"""Called during keepalive run to check status."""
...
@abc.abstractmethod
def keepalive_log(self) -> None:
"""Log any keepalive information."""
...
@property
def is_connected(self):
return self.connected
@property
def login_success(self):
return self.login_status.get("success", False)
@property
def login_failure(self):
return self.login_status["message"]
def set_filter(self, filter):
self.filter = filter
if self._client:
self._client.set_filter(filter)
@property
def client(self):
if not self._client:
self._create_client()
return self._client
def _create_client(self):
try:
self._client = self.setup_connection()
if self.filter:
LOG.info("Creating APRS client filter")
self._client.set_filter(self.filter)
except Exception as e:
LOG.error(f"Failed to create APRS client: {e}")
self._client = None
raise
def stop(self):
if self._client:
LOG.info("Stopping client connection.")
self._client.stop()
def send(self, packet: core.Packet) -> None:
"""Send a packet to the network.
Args:
packet: The APRS packet to send
"""
self.client.send(packet)
@wrapt.synchronized(lock)
def reset(self) -> None:
"""Call this to force a rebuild/reconnect."""
LOG.info("Resetting client connection.")
if self._client:
self._client.close()
del self._client
self._create_client()
else:
LOG.warning("Client not initialized, nothing to reset.")
# Recreate the client
LOG.info(f"Creating new client {self.client}")
@abc.abstractmethod
def setup_connection(self):
"""Initialize and return the underlying APRS connection.
Returns:
object: The initialized connection object
"""
@staticmethod
@abc.abstractmethod
def is_enabled():
pass
@staticmethod
@abc.abstractmethod
def transport():
pass
@abc.abstractmethod
def decode_packet(self, *args, **kwargs):
"""Decode raw APRS packet data into a Packet object.
Returns:
Packet: Decoded APRS packet
"""
@abc.abstractmethod
def consumer(self, callback, blocking=False, immortal=False, raw=False):
pass
@abc.abstractmethod
def is_alive(self):
pass
@abc.abstractmethod
def close(self):
pass

aprsd/client/client.py

@ -0,0 +1,141 @@
import logging
import threading
from typing import Callable
import timeago
import wrapt
from loguru import logger
from oslo_config import cfg
from aprsd.client import drivers # noqa - ensure drivers are registered
from aprsd.client.drivers.registry import DriverRegistry
from aprsd.packets import core
from aprsd.utils import keepalive_collector
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
LOGU = logger
class APRSDClient:
"""APRSD client class.
This is a singleton class that provides a single instance of the APRSD client.
It is responsible for connecting to the appropriate APRSD client driver based on
the configuration.
"""
_instance = None
driver = None
lock = threading.Lock()
filter = None
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
keepalive_collector.KeepAliveCollector().register(cls)
return cls._instance
def __init__(self):
self.connected = False
self.login_status = {
'success': False,
'message': None,
}
if not self.driver:
self.driver = DriverRegistry().get_driver()
self.driver.setup_connection()
def stats(self, serializable=False) -> dict:
stats = {}
if self.driver:
stats = self.driver.stats(serializable=serializable)
return stats
@property
def is_enabled(self):
if not self.driver:
return False
return self.driver.is_enabled()
@property
def is_configured(self):
if not self.driver:
return False
return self.driver.is_configured()
# @property
# def is_connected(self):
# if not self.driver:
# return False
# return self.driver.is_connected()
@property
def login_success(self):
if not self.driver:
return False
return self.driver.login_success
@property
def login_failure(self):
if not self.driver:
return None
return self.driver.login_failure
def set_filter(self, filter):
self.filter = filter
if not self.driver:
return
self.driver.set_filter(filter)
def get_filter(self):
if not self.driver:
return None
return self.driver.filter
def is_alive(self):
return self.driver.is_alive()
def close(self):
if not self.driver:
return
self.driver.close()
@wrapt.synchronized(lock)
def reset(self):
"""Call this to force a rebuild/reconnect."""
LOG.info('Resetting client connection.')
if self.driver:
self.driver.close()
self.driver.setup_connection()
if self.filter:
self.driver.set_filter(self.filter)
else:
LOG.warning('Client not initialized, nothing to reset.')
def send(self, packet: core.Packet) -> bool:
return self.driver.send(packet)
# For the keepalive collector
def keepalive_check(self):
# Don't check the first time through.
if not self.driver.is_alive and self._checks:
LOG.warning("Resetting client. It's not alive.")
self.reset()
self._checks = True
# For the keepalive collector
def keepalive_log(self):
if ka := self.driver.keepalive:
keepalive = timeago.format(ka)
else:
keepalive = 'N/A'
LOGU.opt(colors=True).info(f'<green>Client keepalive {keepalive}</green>')
def consumer(self, callback: Callable, raw: bool = False):
return self.driver.consumer(callback=callback, raw=raw)
def decode_packet(self, *args, **kwargs) -> core.Packet:
return self.driver.decode_packet(*args, **kwargs)


@ -0,0 +1,10 @@
# All client drivers must be registered here
from aprsd.client.drivers.aprsis import APRSISDriver
from aprsd.client.drivers.fake import APRSDFakeDriver
from aprsd.client.drivers.registry import DriverRegistry
from aprsd.client.drivers.tcpkiss import TCPKISSDriver
driver_registry = DriverRegistry()
driver_registry.register(APRSDFakeDriver)
driver_registry.register(APRSISDriver)
driver_registry.register(TCPKISSDriver)


@ -1,234 +1,205 @@
import datetime
import logging
import select
import threading
import time
from typing import Callable
import aprslib
import wrapt
from aprslib import is_py3
from aprslib.exceptions import (
ConnectionDrop,
ConnectionError,
GenericError,
LoginError,
ParseError,
UnknownFormat,
)
from aprslib.exceptions import LoginError
from loguru import logger
from oslo_config import cfg
import aprsd
from aprsd import client, exception
from aprsd.client.drivers.lib.aprslib import APRSLibClient
from aprsd.packets import core
LOG = logging.getLogger("APRSD")
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
LOGU = logger
class Aprsdis(aprslib.IS):
"""Extend the aprslib class so we can exit properly."""
# class APRSISDriver(metaclass=trace.TraceWrapperMetaclass):
class APRSISDriver:
"""This is the APRS-IS driver for the APRSD client.
# flag to tell us to stop
thread_stop = False
This driver uses our modified aprslib.IS class to connect to the APRS-IS server.
# date for last time we heard from the server
aprsd_keepalive = datetime.datetime.now()
"""
# Which server we are connected to?
server_string = "None"
_client = None
_checks = False
# timeout in seconds
select_timeout = 1
lock = threading.Lock()
def __init__(self):
max_timeout = {'hours': 0.0, 'minutes': 2, 'seconds': 0}
self.max_delta = datetime.timedelta(**max_timeout)
self.login_status = {
'success': False,
'message': None,
}
def stop(self):
self.thread_stop = True
LOG.warning("Shutdown Aprsdis client.")
@staticmethod
def is_enabled():
# Defaults to True if the enabled flag is non existent
try:
return CONF.aprs_network.enabled
except KeyError:
return False
@staticmethod
def is_configured():
if APRSISDriver.is_enabled():
# Ensure that the config vars are correctly set
if not CONF.aprs_network.login:
LOG.error('Config aprs_network.login not set.')
raise exception.MissingConfigOptionException(
'aprs_network.login is not set.',
)
if not CONF.aprs_network.password:
LOG.error('Config aprs_network.password not set.')
raise exception.MissingConfigOptionException(
'aprs_network.password is not set.',
)
if not CONF.aprs_network.host:
LOG.error('Config aprs_network.host not set.')
raise exception.MissingConfigOptionException(
'aprs_network.host is not set.',
)
return True
return True
@property
def is_alive(self):
if not self._client:
LOG.warning(f'APRS_CLIENT {self._client} alive? NO!!!')
return False
return self._client.is_alive() and not self._is_stale_connection()
def close(self):
LOG.warning("Closing Aprsdis client.")
super().close()
if self._client:
self._client.stop()
self._client.close()
@wrapt.synchronized(lock)
def send(self, packet: core.Packet):
"""Send an APRS Message object."""
self.sendall(packet.raw)
def send(self, packet: core.Packet) -> bool:
return self._client.send(packet)
def is_alive(self):
"""If the connection is alive or not."""
return self._connected
def _socket_readlines(self, blocking=False):
"""
Generator for complete lines, received from the server
"""
try:
self.sock.setblocking(0)
except OSError as e:
self.logger.error(f"socket error when setblocking(0): {str(e)}")
raise aprslib.ConnectionDrop("connection dropped")
while not self.thread_stop:
short_buf = b""
newline = b"\r\n"
# set a select timeout, so we get a chance to exit
# when user hits CTRL-C
readable, writable, exceptional = select.select(
[self.sock],
[],
[],
self.select_timeout,
)
if not readable:
if not blocking:
break
else:
continue
try:
short_buf = self.sock.recv(4096)
# sock.recv returns empty if the connection drops
if not short_buf:
if not blocking:
# We could just not be blocking, so empty is expected
continue
else:
self.logger.error("socket.recv(): returned empty")
raise aprslib.ConnectionDrop("connection dropped")
except OSError as e:
# self.logger.error("socket error on recv(): %s" % str(e))
if "Resource temporarily unavailable" in str(e):
if not blocking:
if len(self.buf) == 0:
break
self.buf += short_buf
while newline in self.buf:
line, self.buf = self.buf.split(newline, 1)
yield line
def _send_login(self):
"""
Sends login string to server
"""
login_str = "user {0} pass {1} vers github.com/craigerl/aprsd {3}{2}\r\n"
login_str = login_str.format(
self.callsign,
self.passwd,
(" filter " + self.filter) if self.filter != "" else "",
aprsd.__version__,
)
self.logger.debug("Sending login information")
try:
self._sendall(login_str)
self.sock.settimeout(5)
test = self.sock.recv(len(login_str) + 100)
if is_py3:
test = test.decode("latin-1")
test = test.rstrip()
self.logger.debug("Server: '%s'", test)
if not test:
raise LoginError(f"Server Response Empty: '{test}'")
_, _, callsign, status, e = test.split(" ", 4)
s = e.split(",")
if len(s):
server_string = s[0].replace("server ", "")
else:
server_string = e.replace("server ", "")
if callsign == "":
raise LoginError("Server responded with empty callsign???")
if callsign != self.callsign:
raise LoginError(f"Server: {test}")
if status != "verified," and self.passwd != "-1":
raise LoginError("Password is incorrect")
if self.passwd == "-1":
self.logger.info("Login successful (receive only)")
else:
self.logger.info("Login successful")
self.logger.info(f"Connected to {server_string}")
self.server_string = server_string
except LoginError as e:
self.logger.error(str(e))
self.close()
raise
except Exception as e:
self.close()
self.logger.error(f"Failed to login '{e}'")
self.logger.exception(e)
raise LoginError("Failed to login")
def consumer(self, callback, blocking=True, immortal=False, raw=False):
"""
When a position sentence is received, it will be passed to the callback function
blocking: if true (default), runs forever, otherwise will return after one sentence
You can still exit the loop, by raising StopIteration in the callback function
immortal: When true, consumer will try to reconnect and stop propagation of Parse exceptions
if false (default), consumer will return
raw: when true, raw packet is passed to callback, otherwise the result from aprs.parse()
"""
if not self._connected:
raise ConnectionError("not connected to a server")
line = b""
while True and not self.thread_stop:
try:
for line in self._socket_readlines(blocking):
if line[0:1] != b"#":
self.aprsd_keepalive = datetime.datetime.now()
if raw:
callback(line)
else:
callback(self._parse(line))
else:
self.logger.debug("Server: %s", line.decode("utf8"))
self.aprsd_keepalive = datetime.datetime.now()
except ParseError as exp:
self.logger.log(
11,
"%s Packet: '%s'",
exp,
exp.packet,
)
except UnknownFormat as exp:
self.logger.log(
9,
"%s Packet: '%s'",
exp,
exp.packet,
)
except LoginError as exp:
self.logger.error("%s: %s", exp.__class__.__name__, exp)
except (KeyboardInterrupt, SystemExit):
raise
except (ConnectionDrop, ConnectionError):
self.close()
if not immortal:
raise
else:
self.connect(blocking=blocking)
continue
except GenericError:
pass
except StopIteration:
def setup_connection(self):
user = CONF.aprs_network.login
password = CONF.aprs_network.password
host = CONF.aprs_network.host
port = CONF.aprs_network.port
self.connected = False
backoff = 1
retries = 3
retry_count = 0
while not self.connected:
retry_count += 1
if retry_count >= retries:
break
except Exception:
self.logger.error("APRS Packet: %s", line)
raise
try:
LOG.info(
f'Creating aprslib client({host}:{port}) and logging in {user}.'
)
self._client = APRSLibClient(
user, passwd=password, host=host, port=port
)
# Force the log to be the same
self._client.logger = LOG
self._client.connect()
self.connected = self.login_status['success'] = True
self.login_status['message'] = self._client.server_string
backoff = 1
except LoginError as e:
LOG.error(f"Failed to login to APRS-IS Server '{e}'")
self.connected = self.login_status['success'] = False
self.login_status['message'] = (
e.message if hasattr(e, 'message') else str(e)
)
LOG.error(self.login_status['message'])
time.sleep(backoff)
except Exception as e:
LOG.error(f"Unable to connect to APRS-IS server. '{e}' ")
self.connected = self.login_status['success'] = False
self.login_status['message'] = getattr(e, 'message', str(e))
time.sleep(backoff)
# Don't allow the backoff to go to infinity.
if backoff > 5:
backoff = 5
else:
backoff += 1
continue
if not blocking:
break
def set_filter(self, filter):
self._client.set_filter(filter)
def login_success(self) -> bool:
return self.login_status.get('success', False)
def login_failure(self) -> str:
return self.login_status.get('message', None)
@property
def filter(self):
return self._client.filter
@property
def server_string(self):
return self._client.server_string
@property
def keepalive(self):
return self._client.aprsd_keepalive
def _is_stale_connection(self):
delta = datetime.datetime.now() - self._client.aprsd_keepalive
if delta > self.max_delta:
LOG.error(f'Connection is stale, last heard {delta} ago.')
return True
return False
@staticmethod
def transport():
return client.TRANSPORT_APRSIS
def decode_packet(self, *args, **kwargs):
"""APRS lib already decodes this."""
return core.factory(args[0])
def consumer(self, callback: Callable, raw: bool = False):
if self._client:
try:
self._client.consumer(
callback,
blocking=False,
immortal=False,
raw=raw,
)
except Exception as e:
LOG.error(e)
LOG.info(e.__cause__)
raise e
else:
LOG.warning('client is None, might be resetting.')
self.connected = False
def stats(self, serializable=False) -> dict:
stats = {}
if self.is_configured():
if self._client:
keepalive = self._client.aprsd_keepalive
server_string = self._client.server_string
if serializable:
keepalive = keepalive.isoformat()
filter = self.filter
else:
keepalive = 'None'
server_string = 'None'
filter = 'None'
stats = {
'connected': self.is_alive,
'filter': filter,
'login_status': self.login_status,
'connection_keepalive': keepalive,
'server_string': server_string,
'transport': self.transport(),
}
return stats


@ -2,6 +2,7 @@ import datetime
import logging
import threading
import time
from typing import Callable
import aprslib
import wrapt
@ -15,7 +16,7 @@ CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
class APRSDFakeClient(metaclass=trace.TraceWrapperMetaclass):
class APRSDFakeDriver(metaclass=trace.TraceWrapperMetaclass):
"""Fake client for testing."""
# flag to tell us to stop
@ -28,17 +29,40 @@ class APRSDFakeClient(metaclass=trace.TraceWrapperMetaclass):
path = []
def __init__(self):
LOG.info('Starting APRSDFakeClient client.')
LOG.info('Starting APRSDFakeDriver driver.')
self.path = ['WIDE1-1', 'WIDE2-1']
def stop(self):
self.thread_stop = True
LOG.info('Shutdown APRSDFakeClient client.')
@staticmethod
def is_enabled():
if CONF.fake_client.enabled:
return True
return False
@staticmethod
def is_configured():
return APRSDFakeDriver.is_enabled
def is_alive(self):
"""If the connection is alive or not."""
return not self.thread_stop
def close(self):
self.thread_stop = True
LOG.info('Shutdown APRSDFakeDriver driver.')
def setup_connection(self):
# It's fake....
pass
def set_filter(self, filter: str) -> None:
pass
def login_success(self) -> bool:
return True
def login_failure(self) -> str:
return None
@wrapt.synchronized(lock)
def send(self, packet: core.Packet):
"""Send an APRS Message object."""
@ -61,13 +85,37 @@ class APRSDFakeClient(metaclass=trace.TraceWrapperMetaclass):
f'\'{packet.from_call}\' with PATH "{self.path}"',
)
def consumer(self, callback, blocking=False, immortal=False, raw=False):
def consumer(self, callback: Callable, raw: bool = False):
LOG.debug('Start non blocking FAKE consumer')
# Generate packets here?
raw = 'GTOWN>APDW16,WIDE1-1,WIDE2-1:}KM6LYW-9>APZ100,TCPIP,GTOWN*::KM6LYW :KM6LYW: 19 Miles SW'
pkt_raw = aprslib.parse(raw)
pkt = core.factory(pkt_raw)
raw_str = 'GTOWN>APDW16,WIDE1-1,WIDE2-1:}KM6LYW-9>APZ100,TCPIP,GTOWN*::KM6LYW :KM6LYW: 19 Miles SW'
self.aprsd_keepalive = datetime.datetime.now()
callback(packet=pkt)
if raw:
callback(raw=raw_str)
else:
pkt_raw = aprslib.parse(raw_str)
pkt = core.factory(pkt_raw)
callback(packet=pkt)
LOG.debug(f'END blocking FAKE consumer {self}')
time.sleep(8)
time.sleep(1)
def decode_packet(self, *args, **kwargs):
"""APRS lib already decodes this."""
if not kwargs:
return None
if kwargs.get('packet'):
return kwargs.get('packet')
if kwargs.get('raw'):
pkt_raw = aprslib.parse(kwargs.get('raw'))
pkt = core.factory(pkt_raw)
return pkt
def stats(self, serializable: bool = False) -> dict:
return {
'driver': self.__class__.__name__,
'is_alive': self.is_alive(),
'transport': 'fake',
}


@ -1,121 +0,0 @@
import datetime
import logging
import kiss
from ax253 import Frame
from oslo_config import cfg
from aprsd import conf # noqa
from aprsd.packets import core
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
class KISS3Client:
path = []
# date for last time we heard from the server
aprsd_keepalive = datetime.datetime.now()
def __init__(self):
self.setup()
def is_alive(self):
return True
def setup(self):
# we can be TCP kiss or Serial kiss
if CONF.kiss_serial.enabled:
LOG.debug(
'KISS({}) Serial connection to {}'.format(
kiss.__version__,
CONF.kiss_serial.device,
),
)
self.kiss = kiss.SerialKISS(
port=CONF.kiss_serial.device,
speed=CONF.kiss_serial.baudrate,
strip_df_start=True,
)
self.path = CONF.kiss_serial.path
elif CONF.kiss_tcp.enabled:
LOG.debug(
'KISS({}) TCP Connection to {}:{}'.format(
kiss.__version__,
CONF.kiss_tcp.host,
CONF.kiss_tcp.port,
),
)
self.kiss = kiss.TCPKISS(
host=CONF.kiss_tcp.host,
port=CONF.kiss_tcp.port,
strip_df_start=True,
)
self.path = CONF.kiss_tcp.path
LOG.debug('Starting KISS interface connection')
self.kiss.start()
@trace.trace
def stop(self):
try:
self.kiss.stop()
self.kiss.loop.call_soon_threadsafe(
self.kiss.protocol.transport.close,
)
except Exception as ex:
LOG.exception(ex)
def set_filter(self, filter):
# This does nothing right now.
pass
def parse_frame(self, frame_bytes):
try:
frame = Frame.from_bytes(frame_bytes)
# Now parse it with aprslib
kwargs = {
'frame': frame,
}
self._parse_callback(**kwargs)
self.aprsd_keepalive = datetime.datetime.now()
except Exception as ex:
LOG.error('Failed to parse bytes received from KISS interface.')
LOG.exception(ex)
def consumer(self, callback):
self._parse_callback = callback
self.kiss.read(callback=self.parse_frame, min_frames=None)
def send(self, packet):
"""Send an APRS Message object."""
payload = None
path = self.path
if isinstance(packet, core.Packet):
packet.prepare()
payload = packet.payload.encode('US-ASCII')
if packet.path:
path = packet.path
else:
msg_payload = f'{packet.raw}{{{str(packet.msgNo)}'
payload = (
':{:<9}:{}'.format(
packet.to_call,
msg_payload,
)
).encode('US-ASCII')
LOG.debug(
f"KISS Send '{payload}' TO '{packet.to_call}' From "
f"'{packet.from_call}' with PATH '{path}'",
)
frame = Frame.ui(
destination='APZ100',
source=packet.from_call,
path=path,
info=payload,
)
self.kiss.write(frame)



@ -0,0 +1,296 @@
import datetime
import logging
import select
import socket
import threading
import aprslib
import wrapt
from aprslib import is_py3
from aprslib.exceptions import (
ConnectionDrop,
ConnectionError,
GenericError,
LoginError,
ParseError,
UnknownFormat,
)
import aprsd
from aprsd.packets import core
LOG = logging.getLogger('APRSD')
class APRSLibClient(aprslib.IS):
"""Extend the aprslib class so we can exit properly.
This is a modified version of the aprslib.IS class that adds a stop method to
allow the client to exit cleanly.
The aprsis driver uses this class to connect to the APRS-IS server.
"""
# flag to tell us to stop
thread_stop = False
# date for last time we heard from the server
aprsd_keepalive = datetime.datetime.now()
# Which server we are connected to?
server_string = 'None'
# timeout in seconds
select_timeout = 1
lock = threading.Lock()
def stop(self):
self.thread_stop = True
LOG.warning('Shutdown Aprsdis client.')
def close(self):
LOG.warning('Closing Aprsdis client.')
super().close()
@wrapt.synchronized(lock)
def send(self, packet: core.Packet):
"""Send an APRS Message object."""
self.sendall(packet.raw)
def is_alive(self):
"""If the connection is alive or not."""
return self._connected
def _connect(self):
"""
Attempts connection to the server
"""
self.logger.info(
'Attempting connection to %s:%s', self.server[0], self.server[1]
)
try:
self._open_socket()
peer = self.sock.getpeername()
self.logger.info('Connected to %s', str(peer))
# 5 second timeout to receive server banner
self.sock.setblocking(1)
self.sock.settimeout(5)
self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
# MACOS doesn't have TCP_KEEPIDLE
if hasattr(socket, 'TCP_KEEPIDLE'):
self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 1)
self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 3)
self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)
banner = self.sock.recv(512)
if is_py3:
banner = banner.decode('latin-1')
if banner[0] == '#':
self.logger.debug('Banner: %s', banner.rstrip())
else:
raise ConnectionError('invalid banner from server')
except ConnectionError as e:
self.logger.error(str(e))
self.close()
raise
except (socket.error, socket.timeout) as e:
self.close()
self.logger.error('Socket error: %s' % str(e))
if str(e) == 'timed out':
raise ConnectionError('no banner from server') from e
else:
raise ConnectionError(e) from e
self._connected = True
def _socket_readlines(self, blocking=False):
"""
Generator for complete lines, received from the server
"""
try:
self.sock.setblocking(0)
except OSError as e:
self.logger.error(f'socket error when setblocking(0): {str(e)}')
raise aprslib.ConnectionDrop('connection dropped') from e
while not self.thread_stop:
short_buf = b''
newline = b'\r\n'
# set a select timeout, so we get a chance to exit
# when user hits CTRL-C
readable, writable, exceptional = select.select(
[self.sock],
[],
[],
self.select_timeout,
)
if not readable:
if not blocking:
break
else:
continue
try:
short_buf = self.sock.recv(4096)
# sock.recv returns empty if the connection drops
if not short_buf:
if not blocking:
# We could just not be blocking, so empty is expected
continue
else:
self.logger.error('socket.recv(): returned empty')
raise aprslib.ConnectionDrop('connection dropped')
except OSError as e:
# self.logger.error("socket error on recv(): %s" % str(e))
if 'Resource temporarily unavailable' in str(e):
if not blocking:
if len(self.buf) == 0:
break
self.buf += short_buf
while newline in self.buf:
line, self.buf = self.buf.split(newline, 1)
yield line
def _send_login(self):
"""
Sends login string to server
"""
login_str = 'user {0} pass {1} vers Python-APRSD {3}{2}\r\n'
login_str = login_str.format(
self.callsign,
self.passwd,
(' filter ' + self.filter) if self.filter != '' else '',
aprsd.__version__,
)
self.logger.debug('Sending login information')
try:
self._sendall(login_str)
self.sock.settimeout(5)
test = self.sock.recv(len(login_str) + 100)
if is_py3:
test = test.decode('latin-1')
test = test.rstrip()
self.logger.debug("Server: '%s'", test)
if not test:
raise LoginError(f"Server Response Empty: '{test}'")
_, _, callsign, status, e = test.split(' ', 4)
s = e.split(',')
if len(s):
server_string = s[0].replace('server ', '')
else:
server_string = e.replace('server ', '')
if callsign == '':
raise LoginError('Server responded with empty callsign???')
if callsign != self.callsign:
raise LoginError(f'Server: {test}')
if status != 'verified,' and self.passwd != '-1':
raise LoginError('Password is incorrect')
if self.passwd == '-1':
self.logger.info('Login successful (receive only)')
else:
self.logger.info('Login successful')
self.logger.info(f'Connected to {server_string}')
self.server_string = server_string
except LoginError as e:
self.logger.error(str(e))
self.close()
raise
except Exception as e:
self.close()
self.logger.error(f"Failed to login '{e}'")
self.logger.exception(e)
raise LoginError('Failed to login') from e
def consumer(self, callback, blocking=True, immortal=False, raw=False):
"""
When a position sentence is received, it will be passed to the callback function
blocking: if true (default), runs forever, otherwise will return after one sentence
You can still exit the loop, by raising StopIteration in the callback function
immortal: When true, consumer will try to reconnect and stop propagation of Parse exceptions
if false (default), consumer will return
raw: when true, raw packet is passed to callback, otherwise the result from aprs.parse()
"""
if not self._connected:
raise ConnectionError('not connected to a server')
line = b''
while not self.thread_stop:
try:
for line in self._socket_readlines(blocking):
if line[0:1] != b'#':
self.aprsd_keepalive = datetime.datetime.now()
if raw:
callback(line)
else:
callback(self._parse(line))
else:
self.logger.debug('Server: %s', line.decode('utf8'))
self.aprsd_keepalive = datetime.datetime.now()
except ParseError as exp:
self.logger.log(
11,
"%s Packet: '%s'",
exp,
exp.packet,
)
except UnknownFormat as exp:
self.logger.log(
9,
"%s Packet: '%s'",
exp,
exp.packet,
)
except LoginError as exp:
self.logger.error('%s: %s', exp.__class__.__name__, exp)
except (KeyboardInterrupt, SystemExit):
raise
except (ConnectionDrop, ConnectionError):
self.close()
if not immortal:
raise
else:
self.connect(blocking=blocking)
continue
except GenericError:
pass
except StopIteration:
break
except IOError:
if not self.thread_stop:
self.logger.error('IOError')
break
except Exception:
self.logger.error('APRS Packet: %s', line)
raise
if not blocking:
break


@ -0,0 +1,86 @@
from typing import Callable, Protocol, runtime_checkable
from aprsd.packets import core
from aprsd.utils import singleton, trace
@runtime_checkable
class ClientDriver(Protocol):
"""Protocol for APRSD client drivers.
This protocol defines the methods that must be
implemented by APRSD client drivers.
"""
@staticmethod
def is_enabled(self) -> bool:
pass
@staticmethod
def is_configured(self) -> bool:
pass
def is_alive(self) -> bool:
pass
def close(self) -> None:
pass
def send(self, packet: core.Packet) -> bool:
pass
def setup_connection(self) -> None:
pass
def set_filter(self, filter: str) -> None:
pass
def login_success(self) -> bool:
pass
def login_failure(self) -> str:
pass
def consumer(self, callback: Callable, raw: bool = False) -> None:
pass
def decode_packet(self, *args, **kwargs) -> core.Packet:
pass
def stats(self, serializable: bool = False) -> dict:
pass
@singleton
class DriverRegistry(metaclass=trace.TraceWrapperMetaclass):
"""Registry for APRSD client drivers.
This registry is used to register and unregister APRSD client drivers.
This allows us to dynamically load the configured driver at runtime.
All drivers are registered, then when aprsd needs the client, the
registry provides the configured driver for the single instance of the
single APRSD client.
"""
def __init__(self):
self.drivers = []
def register(self, driver: Callable):
if not isinstance(driver, ClientDriver):
raise ValueError('Driver must be of ClientDriver type')
self.drivers.append(driver)
def unregister(self, driver: Callable):
if driver in self.drivers:
self.drivers.remove(driver)
else:
raise ValueError(f'Driver {driver} not found')
def get_driver(self) -> ClientDriver:
"""Get the first enabled driver."""
for driver in self.drivers:
if driver.is_enabled() and driver.is_configured():
return driver()
raise ValueError('No enabled driver found')


@ -0,0 +1,408 @@
"""
APRSD KISS Client Driver using native KISS implementation.
This module provides a KISS client driver for APRSD using the new
non-asyncio KISSInterface implementation.
"""
import datetime
import logging
import select
import socket
import time
from typing import Any, Callable, Dict
import aprslib
from ax253 import frame as ax25frame
from kiss import constants as kiss_constants
from kiss import util as kissutil
from kiss.kiss import Command
from oslo_config import cfg
from aprsd import ( # noqa
client,
conf, # noqa
exception,
)
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
def handle_fend(buffer: bytes, strip_df_start: bool = True) -> bytes:
"""
Handle FEND (end of frame) encountered in a KISS data stream.
:param buffer: the buffer containing the frame
:param strip_df_start: remove leading null byte (DATA_FRAME opcode)
:return: the bytes of the frame without escape characters or frame
end markers (FEND)
"""
frame = kissutil.recover_special_codes(kissutil.strip_nmea(bytes(buffer)))
if strip_df_start:
frame = kissutil.strip_df_start(frame)
LOG.warning(f'handle_fend {" ".join(f"{b:02X}" for b in bytes(frame))}')
return bytes(frame)
# class TCPKISSDriver(metaclass=trace.TraceWrapperMetaclass):
class TCPKISSDriver:
"""APRSD client driver for TCP KISS connections."""
# Class level attributes required by Client protocol
packets_received = 0
packets_sent = 0
last_packet_sent = None
last_packet_received = None
keepalive = None
client_name = None
socket = None
# timeout in seconds
select_timeout = 1
path = None
def __init__(self):
"""Initialize the KISS client.
Args:
client_name: Name of the client instance
"""
super().__init__()
self._connected = False
self.keepalive = datetime.datetime.now()
self._running = False
# Default to reconnecting when the TCP connection drops; read_frame() checks this.
self.auto_reconnect = True
# This is initialized in setup_connection()
self.socket = None
@property
def transport(self) -> str:
return client.TRANSPORT_TCPKISS
@classmethod
def is_enabled(cls) -> bool:
"""Check if KISS is enabled in configuration.
Returns:
bool: True if KISS over TCP is enabled
"""
return CONF.kiss_tcp.enabled
@staticmethod
def is_configured():
# Ensure that the config vars are correctly set
if TCPKISSDriver.is_enabled():
if not CONF.kiss_tcp.host:
LOG.error('KISS TCP enabled, but no host is set.')
raise exception.MissingConfigOptionException(
'kiss_tcp.host is not set.',
)
return True
return False
@property
def is_alive(self) -> bool:
"""Check if the client is connected.
Returns:
bool: True if connected to KISS TNC, False otherwise
"""
return self._connected
def close(self):
"""Close the connection."""
self.stop()
def send(self, packet: core.Packet):
"""Send an APRS packet.
Args:
packet: APRS packet to send (Packet or Message object)
Raises:
Exception: If not connected or send fails
"""
if not self.socket:
raise Exception('KISS interface not initialized')
payload = None
path = self.path
packet.prepare()
payload = packet.payload.encode('US-ASCII')
if packet.path:
path = packet.path
LOG.debug(
f"KISS Send '{payload}' TO '{packet.to_call}' From "
f"'{packet.from_call}' with PATH '{path}'",
)
frame = ax25frame.Frame.ui(
destination='APZ100',
# destination=packet.to_call,
source=packet.from_call,
path=path,
info=payload,
)
# now escape the frame special characters
frame_escaped = kissutil.escape_special_codes(bytes(frame))
# and finally wrap the frame in KISS protocol
command = Command.DATA_FRAME
frame_kiss = b''.join(
[kiss_constants.FEND, command.value, frame_escaped, kiss_constants.FEND]
)
self.socket.send(frame_kiss)
# Update last packet sent time
self.last_packet_sent = datetime.datetime.now()
# Increment packets sent counter
self.packets_sent += 1
def setup_connection(self):
"""Set up the KISS interface."""
if not self.is_enabled():
LOG.error('KISS is not enabled in configuration')
return
try:
# Configure for TCP KISS
if self.is_enabled():
LOG.info(
f'KISS TCP Connection to {CONF.kiss_tcp.host}:{CONF.kiss_tcp.port}'
)
self.path = CONF.kiss_tcp.path
self.connect()
if self._connected:
LOG.info('KISS interface initialized')
else:
LOG.error('Failed to connect to KISS interface')
except Exception as ex:
LOG.error('Failed to initialize KISS interface')
LOG.exception(ex)
self._connected = False
def set_filter(self, filter_text: str):
"""Set packet filter (not implemented for KISS).
Args:
filter_text: Filter specification (ignored for KISS)
"""
# KISS doesn't support filtering at the TNC level
pass
@property
def filter(self) -> str:
"""Get packet filter (not implemented for KISS).
Returns:
str: Empty string (not implemented for KISS)
"""
return ''
def login_success(self) -> bool:
"""There is no login for KISS."""
if not self._connected:
return False
return True
def login_failure(self) -> str:
"""There is no login for KISS."""
return 'Login successful'
def consumer(self, callback: Callable, raw: bool = False):
"""Start consuming frames with the given callback.
Args:
callback: Function to call with received packets
Raises:
Exception: If not connected to KISS TNC
"""
self._running = True
while self._running:
# Ensure connection
if not self._connected:
if not self.connect():
time.sleep(1)
continue
# Read frame
frame = self.read_frame()
if frame:
LOG.warning(f'GOT FRAME: {frame} calling {callback}')
kwargs = {
'frame': frame,
}
callback(**kwargs)
def decode_packet(self, *args, **kwargs) -> core.Packet:
"""Decode a packet from an AX.25 frame.
Args:
frame: Received AX.25 frame
"""
frame = kwargs.get('frame')
if not frame:
LOG.warning('No frame received to decode?!?!')
return None
LOG.warning(f'FRAME: {str(frame)}')
try:
aprslib_frame = aprslib.parse(str(frame))
return core.factory(aprslib_frame)
except Exception as e:
LOG.error(f'Error decoding packet: {e}')
return None
def stop(self):
"""Stop the KISS interface."""
self._running = False
self._connected = False
if self.socket:
try:
self.socket.close()
except Exception:
pass
def stats(self, serializable: bool = False) -> Dict[str, Any]:
"""Get client statistics.
Returns:
Dict containing client statistics
"""
if serializable:
keepalive = self.keepalive.isoformat()
else:
keepalive = self.keepalive
stats = {
'client': self.__class__.__name__,
'transport': self.transport,
'connected': self._connected,
'path': self.path,
'packets_sent': self.packets_sent,
'packets_received': self.packets_received,
'last_packet_sent': self.last_packet_sent,
'last_packet_received': self.last_packet_received,
'connection_keepalive': keepalive,
'host': CONF.kiss_tcp.host,
'port': CONF.kiss_tcp.port,
}
return stats
def connect(self) -> bool:
"""Establish TCP connection to the KISS host.
Returns:
bool: True if connection successful, False otherwise
"""
try:
if self.socket:
try:
self.socket.close()
except Exception:
pass
self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.socket.settimeout(5.0) # 5 second timeout for connection
self.socket.connect((CONF.kiss_tcp.host, CONF.kiss_tcp.port))
self.socket.settimeout(0.1) # Reset to shorter timeout for reads
self._connected = True
self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
# MACOS doesn't have TCP_KEEPIDLE
if hasattr(socket, 'TCP_KEEPIDLE'):
self.socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 1)
self.socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 3)
self.socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)
return True
except ConnectionError as e:
LOG.error(
f'Failed to connect to {CONF.kiss_tcp.host}:{CONF.kiss_tcp.port} - {str(e)}'
)
self._connected = False
return False
except Exception as e:
LOG.error(
f'Failed to connect to {CONF.kiss_tcp.host}:{CONF.kiss_tcp.port} - {str(e)}'
)
self._connected = False
return False
def fix_raw_frame(self, raw_frame: bytes) -> bytes:
"""Fix the raw frame by recalculating the FCS."""
ax25_data = raw_frame[2:-1] # Remove KISS markers
return handle_fend(ax25_data)
def read_frame(self, blocking=False):
"""
Generator for complete lines, received from the server
"""
try:
self.socket.setblocking(0)
except OSError as e:
LOG.error(f'socket error when setblocking(0): {str(e)}')
raise aprslib.ConnectionDrop('connection dropped') from e
while self._running:
short_buf = b''
try:
readable, _, _ = select.select(
[self.socket],
[],
[],
self.select_timeout,
)
if not readable:
if not blocking:
break
else:
continue
except Exception as e:
LOG.error(f'Error in read loop: {e}')
self._connected = False
break
try:
LOG.debug('reading from socket')
short_buf = self.socket.recv(1024)
LOG.debug(f'short_buf: {short_buf}')
# sock.recv returns empty if the connection drops
if not short_buf:
if not blocking:
# We could just not be blocking, so empty is expected
continue
else:
LOG.error('socket.recv(): returned empty')
raise aprslib.ConnectionDrop('connection dropped')
raw_frame = self.fix_raw_frame(short_buf)
return ax25frame.Frame.from_bytes(raw_frame)
except OSError as e:
# self.logger.error("socket error on recv(): %s" % str(e))
if 'Resource temporarily unavailable' in str(e):
if not blocking:
if len(short_buf) == 0:
break
except socket.timeout:
continue
except (KeyboardInterrupt, SystemExit):
raise
except ConnectionError:
self.close()
if not self.auto_reconnect:
raise
else:
self.connect()
continue
except StopIteration:
break
except IOError:
LOG.error('IOError')
break
except Exception as e:
LOG.error(f'Error in read loop: {e}')
self._connected = False
if not self.auto_reconnect:
break
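
A hedged sketch (not part of this changeset) of the KISS framing round trip performed by send() and fix_raw_frame() above, using only the kiss helpers this module already imports; the payload bytes are made up and the extra NMEA/DATA_FRAME stripping done by handle_fend() is omitted for brevity.

from kiss import constants as kiss_constants
from kiss import util as kissutil
from kiss.kiss import Command

payload = b':TEST     :hello{1'  # illustrative APRS info field only
escaped = kissutil.escape_special_codes(payload)  # escape FEND/FESC bytes
wire = b''.join(
    [kiss_constants.FEND, Command.DATA_FRAME.value, escaped, kiss_constants.FEND]
)
# Receive side: drop the leading FEND + opcode and the trailing FEND, then
# recover the escaped bytes, mirroring fix_raw_frame() above.
recovered = kissutil.recover_special_codes(wire[2:-1])
assert recovered == payload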

View File

@ -1,91 +0,0 @@
import logging
from typing import Callable, Protocol, runtime_checkable
from aprsd import exception
from aprsd.packets import core
LOG = logging.getLogger("APRSD")
@runtime_checkable
class Client(Protocol):
def __init__(self):
pass
def connect(self) -> bool:
pass
def disconnect(self) -> bool:
pass
def decode_packet(self, *args, **kwargs) -> type[core.Packet]:
pass
def is_enabled(self) -> bool:
pass
def is_configured(self) -> bool:
pass
def transport(self) -> str:
pass
def send(self, message: str) -> bool:
pass
def setup_connection(self) -> None:
pass
class ClientFactory:
_instance = None
clients = []
client = None
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
# Put any initialization here.
return cls._instance
def __init__(self):
self.clients: list[Callable] = []
def register(self, aprsd_client: Callable):
if isinstance(aprsd_client, Client):
raise ValueError("Client must be a subclass of Client protocol")
self.clients.append(aprsd_client)
def create(self, key=None):
for client in self.clients:
if client.is_enabled():
self.client = client()
return self.client
raise Exception("No client is configured!!")
def client_exists(self):
return bool(self.client)
def is_client_enabled(self):
"""Make sure at least one client is enabled."""
enabled = False
for client in self.clients:
if client.is_enabled():
enabled = True
return enabled
def is_client_configured(self):
enabled = False
for client in self.clients:
try:
if client.is_configured():
enabled = True
except exception.MissingConfigOptionException as ex:
LOG.error(ex.message)
return False
except exception.ConfigOptionBogusDefaultException as ex:
LOG.error(ex.message)
return False
return enabled

View File

@ -1,49 +0,0 @@
import logging
from oslo_config import cfg
from aprsd import client
from aprsd.client import base
from aprsd.client.drivers import fake as fake_driver
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSDFakeClient(base.APRSClient, metaclass=trace.TraceWrapperMetaclass):
def stats(self, serializable=False) -> dict:
return {
"transport": "Fake",
"connected": True,
}
@staticmethod
def is_enabled():
if CONF.fake_client.enabled:
return True
return False
@staticmethod
def is_configured():
return APRSDFakeClient.is_enabled()
def is_alive(self):
return True
def close(self):
pass
def setup_connection(self):
self.connected = True
return fake_driver.APRSDFakeClient()
@staticmethod
def transport():
return client.TRANSPORT_FAKE
def decode_packet(self, *args, **kwargs):
LOG.debug(f"kwargs {kwargs}")
pkt = kwargs["packet"]
LOG.debug(f"Got an APRS Fake Packet '{pkt}'")
return pkt

View File

@ -1,142 +0,0 @@
import datetime
import logging
import aprslib
import timeago
from loguru import logger
from oslo_config import cfg
from aprsd import client, exception
from aprsd.client import base
from aprsd.client.drivers import kiss
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
LOGU = logger
class KISSClient(base.APRSClient):
_client = None
keepalive = datetime.datetime.now()
def stats(self, serializable=False) -> dict:
stats = {}
if self.is_configured():
keepalive = self.keepalive
if serializable:
keepalive = keepalive.isoformat()
stats = {
'connected': self.is_connected,
'connection_keepalive': keepalive,
'transport': self.transport(),
}
if self.transport() == client.TRANSPORT_TCPKISS:
stats['host'] = CONF.kiss_tcp.host
stats['port'] = CONF.kiss_tcp.port
elif self.transport() == client.TRANSPORT_SERIALKISS:
stats['device'] = CONF.kiss_serial.device
return stats
@staticmethod
def is_enabled():
"""Return if tcp or serial KISS is enabled."""
if CONF.kiss_serial.enabled:
return True
if CONF.kiss_tcp.enabled:
return True
return False
@staticmethod
def is_configured():
# Ensure that the config vars are correctly set
if KISSClient.is_enabled():
transport = KISSClient.transport()
if transport == client.TRANSPORT_SERIALKISS:
if not CONF.kiss_serial.device:
LOG.error('KISS serial enabled, but no device is set.')
raise exception.MissingConfigOptionException(
'kiss_serial.device is not set.',
)
elif transport == client.TRANSPORT_TCPKISS:
if not CONF.kiss_tcp.host:
LOG.error('KISS TCP enabled, but no host is set.')
raise exception.MissingConfigOptionException(
'kiss_tcp.host is not set.',
)
return True
return False
def is_alive(self):
if self._client:
return self._client.is_alive()
else:
return False
def close(self):
if self._client:
self._client.stop()
def keepalive_check(self):
# Don't check the first time through.
if not self.is_alive() and self._checks:
LOG.warning("Resetting client. It's not alive.")
self.reset()
self._checks = True
def keepalive_log(self):
if ka := self._client.aprsd_keepalive:
keepalive = timeago.format(ka)
else:
keepalive = 'N/A'
LOGU.opt(colors=True).info(f'<green>Client keepalive {keepalive}</green>')
@staticmethod
def transport():
if CONF.kiss_serial.enabled:
return client.TRANSPORT_SERIALKISS
if CONF.kiss_tcp.enabled:
return client.TRANSPORT_TCPKISS
def decode_packet(self, *args, **kwargs):
"""We get a frame, which has to be decoded."""
LOG.debug(f'kwargs {kwargs}')
frame = kwargs['frame']
LOG.debug(f"Got an APRS Frame '{frame}'")
# try and nuke the * from the fromcall sign.
# frame.header._source._ch = False
# payload = str(frame.payload.decode())
# msg = f"{str(frame.header)}:{payload}"
# msg = frame.tnc2
# LOG.debug(f"Decoding {msg}")
try:
raw = aprslib.parse(str(frame))
packet = core.factory(raw)
if isinstance(packet, core.ThirdPartyPacket):
return packet.subpacket
else:
return packet
except Exception as ex:
LOG.error(f'Error decoding packet: {ex}')
def setup_connection(self):
try:
self._client = kiss.KISS3Client()
self.connected = self.login_status['success'] = True
except Exception as ex:
self.connected = self.login_status['success'] = False
self.login_status['message'] = str(ex)
return self._client
def consumer(self, callback, blocking=False, immortal=False, raw=False):
try:
self._client.consumer(callback)
self.keepalive = datetime.datetime.now()
except Exception as ex:
LOG.error(f'Consumer failed {ex}')
LOG.error(ex)

View File

@ -3,7 +3,7 @@ import threading
import wrapt
from oslo_config import cfg
from aprsd import client
from aprsd.client.client import APRSDClient
from aprsd.utils import singleton
CONF = cfg.CONF
@ -15,4 +15,4 @@ class APRSClientStats:
@wrapt.synchronized(lock)
def stats(self, serializable=False):
return client.client_factory.create().stats(serializable=serializable)
return APRSDClient().stats(serializable=serializable)

View File

@ -11,7 +11,6 @@ from oslo_config import cfg
from aprsd import cli_helper, conf, packets, plugin
# local imports here
from aprsd.client import base
from aprsd.main import cli
from aprsd.utils import trace
@ -97,8 +96,6 @@ def test_plugin(
if CONF.trace_enabled:
trace.setup_tracing(['method', 'api'])
base.APRSClient()
pm = plugin.PluginManager()
if load_all:
pm.setup_plugins(load_help_plugin=CONF.load_help_plugin)

View File

@ -17,11 +17,13 @@ from rich.console import Console
# local imports here
import aprsd
from aprsd import cli_helper, packets, plugin, threads, utils
from aprsd.client import client_factory
from aprsd.client.client import APRSDClient
from aprsd.main import cli
from aprsd.packets import collector as packet_collector
from aprsd.packets import core, seen_list
from aprsd.packets import log as packet_log
from aprsd.packets import seen_list
from aprsd.packets.filter import PacketFilter
from aprsd.packets.filters import dupe_filter, packet_type
from aprsd.stats import collector
from aprsd.threads import keepalive, rx
from aprsd.threads import stats as stats_thread
@ -29,7 +31,7 @@ from aprsd.threads.aprsd import APRSDThread
# setup the global logger
# log.basicConfig(level=log.DEBUG) # level=10
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
CONF = cfg.CONF
LOGU = logger
console = Console()
@ -37,9 +39,9 @@ console = Console()
def signal_handler(sig, frame):
threads.APRSDThreadList().stop_all()
if "subprocess" not in str(frame):
if 'subprocess' not in str(frame):
LOG.info(
"Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}".format(
'Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}'.format(
datetime.datetime.now(),
),
)
@ -48,90 +50,66 @@ def signal_handler(sig, frame):
collector.Collector().collect()
class APRSDListenThread(rx.APRSDRXThread):
class APRSDListenProcessThread(rx.APRSDFilterThread):
def __init__(
self,
packet_queue,
packet_filter=None,
plugin_manager=None,
enabled_plugins=[],
enabled_plugins=None,
log_packets=False,
):
super().__init__(packet_queue)
super().__init__('ListenProcThread', packet_queue)
self.packet_filter = packet_filter
self.plugin_manager = plugin_manager
if self.plugin_manager:
LOG.info(f"Plugins {self.plugin_manager.get_message_plugins()}")
LOG.info(f'Plugins {self.plugin_manager.get_message_plugins()}')
self.log_packets = log_packets
def process_packet(self, *args, **kwargs):
packet = self._client.decode_packet(*args, **kwargs)
filters = {
packets.Packet.__name__: packets.Packet,
packets.AckPacket.__name__: packets.AckPacket,
packets.BeaconPacket.__name__: packets.BeaconPacket,
packets.GPSPacket.__name__: packets.GPSPacket,
packets.MessagePacket.__name__: packets.MessagePacket,
packets.MicEPacket.__name__: packets.MicEPacket,
packets.ObjectPacket.__name__: packets.ObjectPacket,
packets.StatusPacket.__name__: packets.StatusPacket,
packets.ThirdPartyPacket.__name__: packets.ThirdPartyPacket,
packets.WeatherPacket.__name__: packets.WeatherPacket,
packets.UnknownPacket.__name__: packets.UnknownPacket,
}
def print_packet(self, packet):
if self.log_packets:
packet_log.log(packet)
if self.packet_filter:
filter_class = filters[self.packet_filter]
if isinstance(packet, filter_class):
if self.log_packets:
packet_log.log(packet)
if self.plugin_manager:
# Don't do anything with the reply
# This is the listen only command.
self.plugin_manager.run(packet)
else:
if self.log_packets:
packet_log.log(packet)
if self.plugin_manager:
# Don't do anything with the reply.
# This is the listen only command.
self.plugin_manager.run(packet)
packet_collector.PacketCollector().rx(packet)
def process_packet(self, packet: type[core.Packet]):
if self.plugin_manager:
# Don't do anything with the reply.
# This is the listen only command.
self.plugin_manager.run(packet)
class ListenStatsThread(APRSDThread):
"""Log the stats from the PacketList."""
def __init__(self):
super().__init__("PacketStatsLog")
super().__init__('PacketStatsLog')
self._last_total_rx = 0
self.period = 31
def loop(self):
if self.loop_count % 10 == 0:
if self.loop_count % self.period == 0:
# log the stats every self.period seconds
stats_json = collector.Collector().collect()
stats = stats_json["PacketList"]
total_rx = stats["rx"]
packet_count = len(stats["packets"])
stats = stats_json['PacketList']
total_rx = stats['rx']
packet_count = len(stats['packets'])
rx_delta = total_rx - self._last_total_rx
rate = rx_delta / 10
rate = rx_delta / self.period
# Log summary stats
LOGU.opt(colors=True).info(
f"<green>RX Rate: {rate} pps</green> "
f"<yellow>Total RX: {total_rx}</yellow> "
f"<red>RX Last 10 secs: {rx_delta}</red> "
f"<white>Packets in PacketList: {packet_count}</white>",
f'<green>RX Rate: {rate:.2f} pps</green> '
f'<yellow>Total RX: {total_rx}</yellow> '
f'<red>RX Last {self.period} secs: {rx_delta}</red> '
f'<white>Packets in PacketListStats: {packet_count}</white>',
)
self._last_total_rx = total_rx
# Log individual type stats
for k, v in stats["types"].items():
thread_hex = f"fg {utils.hex_from_name(k)}"
for k, v in stats['types'].items():
thread_hex = f'fg {utils.hex_from_name(k)}'
LOGU.opt(colors=True).info(
f"<{thread_hex}>{k:<15}</{thread_hex}> "
f"<blue>RX: {v['rx']}</blue> <red>TX: {v['tx']}</red>",
f'<{thread_hex}>{k:<15}</{thread_hex}> '
f'<blue>RX: {v["rx"]}</blue> <red>TX: {v["tx"]}</red>',
)
time.sleep(1)
@ -141,19 +119,19 @@ class ListenStatsThread(APRSDThread):
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--aprs-login",
envvar="APRS_LOGIN",
'--aprs-login',
envvar='APRS_LOGIN',
show_envvar=True,
help="What callsign to send the message from.",
help='What callsign to send the message from.',
)
@click.option(
"--aprs-password",
envvar="APRS_PASSWORD",
'--aprs-password',
envvar='APRS_PASSWORD',
show_envvar=True,
help="the APRS-IS password for APRS_LOGIN",
help='the APRS-IS password for APRS_LOGIN',
)
@click.option(
"--packet-filter",
'--packet-filter',
type=click.Choice(
[
packets.AckPacket.__name__,
@ -170,35 +148,37 @@ class ListenStatsThread(APRSDThread):
],
case_sensitive=False,
),
help="Filter by packet type",
)
@click.option(
"--enable-plugin",
multiple=True,
help="Enable a plugin. This is the name of the file in the plugins directory.",
default=[],
help='Filter by packet type',
)
@click.option(
"--load-plugins",
'--enable-plugin',
multiple=True,
help='Enable a plugin. This is the name of the file in the plugins directory.',
)
@click.option(
'--load-plugins',
default=False,
is_flag=True,
help="Load plugins as enabled in aprsd.conf ?",
help='Load plugins as enabled in aprsd.conf ?',
)
@click.argument(
"filter",
'filter',
nargs=-1,
required=True,
)
@click.option(
"--log-packets",
'--log-packets',
default=False,
is_flag=True,
help="Log incoming packets.",
help='Log incoming packets.',
)
@click.option(
"--enable-packet-stats",
'--enable-packet-stats',
default=False,
is_flag=True,
help="Enable packet stats periodic logging.",
help='Enable packet stats periodic logging.',
)
@click.pass_context
@cli_helper.process_standard_options
@ -228,46 +208,46 @@ def listen(
if not aprs_login:
click.echo(ctx.get_help())
click.echo("")
ctx.fail("Must set --aprs-login or APRS_LOGIN")
click.echo('')
ctx.fail('Must set --aprs-login or APRS_LOGIN')
ctx.exit()
if not aprs_password:
click.echo(ctx.get_help())
click.echo("")
ctx.fail("Must set --aprs-password or APRS_PASSWORD")
click.echo('')
ctx.fail('Must set --aprs-password or APRS_PASSWORD')
ctx.exit()
# CONF.aprs_network.login = aprs_login
# config["aprs"]["password"] = aprs_password
LOG.info(f"APRSD Listen Started version: {aprsd.__version__}")
LOG.info(f'APRSD Listen Started version: {aprsd.__version__}')
CONF.log_opt_values(LOG, logging.DEBUG)
collector.Collector()
# Try and load saved MsgTrack list
LOG.debug("Loading saved MsgTrack object.")
LOG.debug('Loading saved MsgTrack object.')
# Initialize the client factory and create
# The correct client object ready for use
# Make sure we have 1 client transport enabled
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
if not APRSDClient().is_enabled:
LOG.error('No Clients are enabled in config.')
sys.exit(-1)
# Creates the client object
LOG.info("Creating client connection")
aprs_client = client_factory.create()
LOG.info('Creating client connection')
aprs_client = APRSDClient()
LOG.info(aprs_client)
if not aprs_client.login_success:
# We failed to login, will just quit!
msg = f"Login Failure: {aprs_client.login_failure}"
msg = f'Login Failure: {aprs_client.login_failure}'
LOG.error(msg)
print(msg)
sys.exit(-1)
LOG.debug(f"Filter by '{filter}'")
LOG.debug(f"Filter messages on aprsis server by '{filter}'")
aprs_client.set_filter(filter)
keepalive_thread = keepalive.KeepAliveThread()
@ -276,10 +256,19 @@ def listen(
# just deregister the class from the packet collector
packet_collector.PacketCollector().unregister(seen_list.SeenList)
# we don't want the dupe filter to run here.
PacketFilter().unregister(dupe_filter.DupePacketFilter)
if packet_filter:
LOG.info(f'Enabling packet filtering for {packet_filter}')
packet_type.PacketTypeFilter().set_allow_list(packet_filter)
PacketFilter().register(packet_type.PacketTypeFilter)
else:
LOG.info('No packet filtering enabled.')
pm = None
if load_plugins:
pm = plugin.PluginManager()
LOG.info("Loading plugins")
LOG.info('Loading plugins')
pm.setup_plugins(load_help_plugin=False)
elif enable_plugin:
pm = plugin.PluginManager()
@ -290,33 +279,37 @@ def listen(
else:
LOG.warning(
"Not Loading any plugins use --load-plugins to load what's "
"defined in the config file.",
'defined in the config file.',
)
if pm:
for p in pm.get_plugins():
LOG.info("Loaded plugin %s", p.__class__.__name__)
LOG.info('Loaded plugin %s', p.__class__.__name__)
stats = stats_thread.APRSDStatsStoreThread()
stats.start()
LOG.debug("Create APRSDListenThread")
listen_thread = APRSDListenThread(
LOG.debug('Start APRSDRxThread')
rx_thread = rx.APRSDRXThread(packet_queue=threads.packet_queue)
rx_thread.start()
LOG.debug('Create APRSDListenProcessThread')
listen_thread = APRSDListenProcessThread(
packet_queue=threads.packet_queue,
packet_filter=packet_filter,
plugin_manager=pm,
enabled_plugins=enable_plugin,
log_packets=log_packets,
)
LOG.debug("Start APRSDListenThread")
LOG.debug('Start APRSDListenProcessThread')
listen_thread.start()
if enable_packet_stats:
listen_stats = ListenStatsThread()
listen_stats.start()
keepalive_thread.start()
LOG.debug("keepalive Join")
LOG.debug('keepalive Join')
keepalive_thread.join()
LOG.debug("listen_thread Join")
rx_thread.join()
listen_thread.join()
stats.join()
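
A hedged sketch (not part of this changeset) of the packet-filter wiring the listen command performs above: drop the dupe filter so repeated packets are still shown, then restrict processing to a single packet type via the same allow-list call the --packet-filter option uses.

from aprsd.packets.filter import PacketFilter
from aprsd.packets.filters import dupe_filter, packet_type

PacketFilter().unregister(dupe_filter.DupePacketFilter)
packet_type.PacketTypeFilter().set_allow_list('MessagePacket')  # same value --packet-filter passes in
PacketFilter().register(packet_type.PacketTypeFilter)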

View File

@ -3,58 +3,60 @@ import sys
import time
import aprslib
from aprslib.exceptions import LoginError
import click
from aprslib.exceptions import LoginError
from oslo_config import cfg
import aprsd
from aprsd import cli_helper, packets
from aprsd import conf # noqa : F401
from aprsd.client import client_factory
from aprsd.main import cli
import aprsd.packets # noqa : F401
from aprsd import (
cli_helper,
conf, # noqa : F401
packets,
)
from aprsd.client.client import APRSDClient
from aprsd.main import cli
from aprsd.packets import collector
from aprsd.packets import log as packet_log
from aprsd.threads import tx
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--aprs-login",
envvar="APRS_LOGIN",
'--aprs-login',
envvar='APRS_LOGIN',
show_envvar=True,
help="What callsign to send the message from. Defaults to config entry.",
help='What callsign to send the message from. Defaults to config entry.',
)
@click.option(
"--aprs-password",
envvar="APRS_PASSWORD",
'--aprs-password',
envvar='APRS_PASSWORD',
show_envvar=True,
help="the APRS-IS password for APRS_LOGIN. Defaults to config entry.",
help='the APRS-IS password for APRS_LOGIN. Defaults to config entry.',
)
@click.option(
"--no-ack",
"-n",
'--no-ack',
'-n',
is_flag=True,
show_default=True,
default=False,
help="Don't wait for an ack, just sent it to APRS-IS and bail.",
)
@click.option(
"--wait-response",
"-w",
'--wait-response',
'-w',
is_flag=True,
show_default=True,
default=False,
help="Wait for a response to the message?",
help='Wait for a response to the message?',
)
@click.option("--raw", default=None, help="Send a raw message. Implies --no-ack")
@click.argument("tocallsign", required=True)
@click.argument("command", nargs=-1, required=True)
@click.option('--raw', default=None, help='Send a raw message. Implies --no-ack')
@click.argument('tocallsign', required=True)
@click.argument('command', nargs=-1, required=True)
@click.pass_context
@cli_helper.process_standard_options
def send_message(
@ -69,11 +71,11 @@ def send_message(
):
"""Send a message to a callsign via APRS_IS."""
global got_ack, got_response
quiet = ctx.obj["quiet"]
quiet = ctx.obj['quiet']
if not aprs_login:
if CONF.aprs_network.login == conf.client.DEFAULT_LOGIN:
click.echo("Must set --aprs_login or APRS_LOGIN")
click.echo('Must set --aprs_login or APRS_LOGIN')
ctx.exit(-1)
return
else:
@ -81,15 +83,15 @@ def send_message(
if not aprs_password:
if not CONF.aprs_network.password:
click.echo("Must set --aprs-password or APRS_PASSWORD")
click.echo('Must set --aprs-password or APRS_PASSWORD')
ctx.exit(-1)
return
else:
aprs_password = CONF.aprs_network.password
LOG.info(f"APRSD LISTEN Started version: {aprsd.__version__}")
LOG.info(f'APRSD LISTEN Started version: {aprsd.__version__}')
if type(command) is tuple:
command = " ".join(command)
command = ' '.join(command)
if not quiet:
if raw:
LOG.info(f"L'{aprs_login}' R'{raw}'")
@ -101,7 +103,7 @@ def send_message(
def rx_packet(packet):
global got_ack, got_response
cl = client_factory.create()
cl = APRSDClient()
packet = cl.decode_packet(packet)
collector.PacketCollector().rx(packet)
packet_log.log(packet, tx=False)
@ -129,7 +131,7 @@ def send_message(
sys.exit(0)
try:
client_factory.create().client
APRSDClient().client # noqa: B018
except LoginError:
sys.exit(-1)
@ -140,7 +142,7 @@ def send_message(
# message
if raw:
tx.send(
packets.Packet(from_call="", to_call="", raw=raw),
packets.Packet(from_call='', to_call='', raw=raw),
direct=True,
)
sys.exit(0)
@ -161,10 +163,10 @@ def send_message(
# This will register a packet consumer with aprslib
# When new packets come in the consumer will process
# the packet
aprs_client = client_factory.create().client
aprs_client = APRSDClient()
aprs_client.consumer(rx_packet, raw=False)
except aprslib.exceptions.ConnectionDrop:
LOG.error("Connection dropped, reconnecting")
LOG.error('Connection dropped, reconnecting')
time.sleep(5)
# Force the deletion of the client object connected to aprs
# This will cause a reconnect, next time client.get_client()
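
A hedged sketch (not part of this changeset) of the direct-send path the send-message command uses for --raw; the raw APRS string here is made up.

from aprsd import packets
from aprsd.threads import tx

raw = 'N0CALL>APZ100::N0CALL-1 :hello from aprsd'  # illustrative only
tx.send(packets.Packet(from_call='', to_call='', raw=raw), direct=True)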

View File

@ -8,63 +8,28 @@ from oslo_config import cfg
import aprsd
from aprsd import cli_helper, plugin, threads, utils
from aprsd import main as aprsd_main
from aprsd.client import client_factory
from aprsd.client.client import APRSDClient
from aprsd.main import cli
from aprsd.packets import collector as packet_collector
from aprsd.packets import seen_list
from aprsd.threads import aprsd as aprsd_threads
from aprsd.threads import keepalive, registry, rx, tx
from aprsd.threads import keepalive, registry, rx, service, tx
from aprsd.threads import stats as stats_thread
from aprsd.utils import singleton
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
@singleton
class ServerThreads:
"""Registry for threads that the server command runs.
This enables extensions to register a thread to run during
the server command.
"""
def __init__(self):
self.threads: list[aprsd_threads.APRSDThread] = []
def register(self, thread: aprsd_threads.APRSDThread):
if not isinstance(thread, aprsd_threads.APRSDThread):
raise TypeError(f"Thread {thread} is not an APRSDThread")
self.threads.append(thread)
def unregister(self, thread: aprsd_threads.APRSDThread):
if not isinstance(thread, aprsd_threads.APRSDThread):
raise TypeError(f"Thread {thread} is not an APRSDThread")
self.threads.remove(thread)
def start(self):
"""Start all threads in the list."""
for thread in self.threads:
thread.start()
def join(self):
"""Join all the threads in the list"""
for thread in self.threads:
thread.join()
LOG = logging.getLogger('APRSD')
# main() ###
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"-f",
"--flush",
"flush",
'-f',
'--flush',
'flush',
is_flag=True,
show_default=True,
default=False,
help="Flush out all old aged messages on disk.",
help='Flush out all old aged messages on disk.',
)
@click.pass_context
@cli_helper.process_standard_options
@ -73,37 +38,31 @@ def server(ctx, flush):
signal.signal(signal.SIGINT, aprsd_main.signal_handler)
signal.signal(signal.SIGTERM, aprsd_main.signal_handler)
server_threads = ServerThreads()
service_threads = service.ServiceThreads()
level, msg = utils._check_version()
if level:
LOG.warning(msg)
else:
LOG.info(msg)
LOG.info(f"APRSD Started version: {aprsd.__version__}")
# Initialize the client factory and create
# The correct client object ready for use
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
sys.exit(-1)
LOG.info(f'APRSD Started version: {aprsd.__version__}')
# Make sure we have 1 client transport enabled
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
if not APRSDClient().is_enabled:
LOG.error('No Clients are enabled in config.')
sys.exit(-1)
if not client_factory.is_client_configured():
LOG.error("APRS client is not properly configured in config file.")
if not APRSDClient().is_configured:
LOG.error('APRS client is not properly configured in config file.')
sys.exit(-1)
# Creates the client object
LOG.info("Creating client connection")
aprs_client = client_factory.create()
LOG.info('Creating client connection')
aprs_client = APRSDClient()
LOG.info(aprs_client)
if not aprs_client.login_success:
# We failed to login, will just quit!
msg = f"Login Failure: {aprs_client.login_failure}"
msg = f'Login Failure: {aprs_client.login_failure}'
LOG.error(msg)
print(msg)
sys.exit(-1)
@ -114,7 +73,7 @@ def server(ctx, flush):
# We register plugins first here so we can register each
# plugins config options, so we can dump them all in the
# log file output.
LOG.info("Loading Plugin Manager and registering plugins")
LOG.info('Loading Plugin Manager and registering plugins')
plugin_manager = plugin.PluginManager()
plugin_manager.setup_plugins(load_help_plugin=CONF.load_help_plugin)
@ -122,10 +81,10 @@ def server(ctx, flush):
CONF.log_opt_values(LOG, logging.DEBUG)
message_plugins = plugin_manager.get_message_plugins()
watchlist_plugins = plugin_manager.get_watchlist_plugins()
LOG.info("Message Plugins enabled and running:")
LOG.info('Message Plugins enabled and running:')
for p in message_plugins:
LOG.info(p)
LOG.info("Watchlist Plugins enabled and running:")
LOG.info('Watchlist Plugins enabled and running:')
for p in watchlist_plugins:
LOG.info(p)
@ -135,37 +94,37 @@ def server(ctx, flush):
# Now load the msgTrack from disk if any
if flush:
LOG.debug("Flushing All packet tracking objects.")
LOG.debug('Flushing All packet tracking objects.')
packet_collector.PacketCollector().flush()
else:
# Try and load saved MsgTrack list
LOG.debug("Loading saved packet tracking data.")
LOG.debug('Loading saved packet tracking data.')
packet_collector.PacketCollector().load()
# Now start all the main processing threads.
server_threads.register(keepalive.KeepAliveThread())
server_threads.register(stats_thread.APRSDStatsStoreThread())
server_threads.register(
rx.APRSDPluginRXThread(
service_threads.register(keepalive.KeepAliveThread())
service_threads.register(stats_thread.APRSDStatsStoreThread())
service_threads.register(
rx.APRSDRXThread(
packet_queue=threads.packet_queue,
),
)
server_threads.register(
service_threads.register(
rx.APRSDPluginProcessPacketThread(
packet_queue=threads.packet_queue,
),
)
if CONF.enable_beacon:
LOG.info("Beacon Enabled. Starting Beacon thread.")
server_threads.register(tx.BeaconSendThread())
LOG.info('Beacon Enabled. Starting Beacon thread.')
service_threads.register(tx.BeaconSendThread())
if CONF.aprs_registry.enabled:
LOG.info("Registry Enabled. Starting Registry thread.")
server_threads.register(registry.APRSRegistryThread())
LOG.info('Registry Enabled. Starting Registry thread.')
service_threads.register(registry.APRSRegistryThread())
server_threads.start()
server_threads.join()
service_threads.start()
service_threads.join()
return 0
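
A hedged sketch (not part of this changeset) of the ServiceThreads pattern the server command uses above: register every thread the service needs, then start and join them all in one place.

from aprsd import threads
from aprsd.threads import keepalive, rx, service
from aprsd.threads import stats as stats_thread

service_threads = service.ServiceThreads()
service_threads.register(keepalive.KeepAliveThread())
service_threads.register(stats_thread.APRSDStatsStoreThread())
service_threads.register(rx.APRSDRXThread(packet_queue=threads.packet_queue))
service_threads.start()  # start every registered thread
service_threads.join()   # block until they all exit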

View File

@ -3,220 +3,219 @@ from pathlib import Path
from oslo_config import cfg
home = str(Path.home())
DEFAULT_CONFIG_DIR = f"{home}/.config/aprsd/"
APRSD_DEFAULT_MAGIC_WORD = "CHANGEME!!!"
DEFAULT_CONFIG_DIR = f'{home}/.config/aprsd/'
APRSD_DEFAULT_MAGIC_WORD = 'CHANGEME!!!'
watch_list_group = cfg.OptGroup(
name="watch_list",
title="Watch List settings",
name='watch_list',
title='Watch List settings',
)
registry_group = cfg.OptGroup(
name="aprs_registry",
title="APRS Registry settings",
name='aprs_registry',
title='APRS Registry settings',
)
aprsd_opts = [
cfg.StrOpt(
"callsign",
'callsign',
required=True,
help="Callsign to use for messages sent by APRSD",
help='Callsign to use for messages sent by APRSD',
),
cfg.BoolOpt(
"enable_save",
'enable_save',
default=True,
help="Enable saving of watch list, packet tracker between restarts.",
help='Enable saving of watch list, packet tracker between restarts.',
),
cfg.StrOpt(
"save_location",
'save_location',
default=DEFAULT_CONFIG_DIR,
help="Save location for packet tracking files.",
help='Save location for packet tracking files.',
),
cfg.BoolOpt(
"trace_enabled",
'trace_enabled',
default=False,
help="Enable code tracing",
help='Enable code tracing',
),
cfg.StrOpt(
"units",
default="imperial",
help="Units for display, imperial or metric",
'units',
default='imperial',
help='Units for display, imperial or metric',
),
cfg.IntOpt(
"ack_rate_limit_period",
'ack_rate_limit_period',
default=1,
help="The wait period in seconds per Ack packet being sent."
"1 means 1 ack packet per second allowed."
"2 means 1 pack packet every 2 seconds allowed",
help='The wait period in seconds per Ack packet being sent. '
'1 means 1 ack packet per second allowed. '
'2 means 1 ack packet every 2 seconds allowed',
),
cfg.IntOpt(
"msg_rate_limit_period",
'msg_rate_limit_period',
default=2,
help="Wait period in seconds per non AckPacket being sent."
"2 means 1 packet every 2 seconds allowed."
"5 means 1 pack packet every 5 seconds allowed",
help='Wait period in seconds per non AckPacket being sent. '
'2 means 1 packet every 2 seconds allowed. '
'5 means 1 packet every 5 seconds allowed',
),
cfg.IntOpt(
"packet_dupe_timeout",
'packet_dupe_timeout',
default=300,
help="The number of seconds before a packet is not considered a duplicate.",
help='The number of seconds before a packet is not considered a duplicate.',
),
cfg.BoolOpt(
"enable_beacon",
'enable_beacon',
default=False,
help="Enable sending of a GPS Beacon packet to locate this service. "
"Requires latitude and longitude to be set.",
help='Enable sending of a GPS Beacon packet to locate this service. '
'Requires latitude and longitude to be set.',
),
cfg.IntOpt(
"beacon_interval",
'beacon_interval',
default=1800,
help="The number of seconds between beacon packets.",
help='The number of seconds between beacon packets.',
),
cfg.StrOpt(
"beacon_symbol",
default="/",
help="The symbol to use for the GPS Beacon packet. See: http://www.aprs.net/vm/DOS/SYMBOLS.HTM",
'beacon_symbol',
default='/',
help='The symbol to use for the GPS Beacon packet. See: http://www.aprs.net/vm/DOS/SYMBOLS.HTM',
),
cfg.StrOpt(
"latitude",
'latitude',
default=None,
help="Latitude for the GPS Beacon button. If not set, the button will not be enabled.",
help='Latitude for the GPS Beacon button. If not set, the button will not be enabled.',
),
cfg.StrOpt(
"longitude",
'longitude',
default=None,
help="Longitude for the GPS Beacon button. If not set, the button will not be enabled.",
help='Longitude for the GPS Beacon button. If not set, the button will not be enabled.',
),
cfg.StrOpt(
"log_packet_format",
choices=["compact", "multiline", "both"],
default="compact",
'log_packet_format',
choices=['compact', 'multiline', 'both'],
default='compact',
help="When logging packets 'compact' will use a single line formatted for each packet."
"'multiline' will use multiple lines for each packet and is the traditional format."
"both will log both compact and multiline.",
'both will log both compact and multiline.',
),
cfg.IntOpt(
"default_packet_send_count",
'default_packet_send_count',
default=3,
help="The number of times to send a non ack packet before giving up.",
help='The number of times to send a non ack packet before giving up.',
),
cfg.IntOpt(
"default_ack_send_count",
'default_ack_send_count',
default=3,
help="The number of times to send an ack packet in response to recieving a packet.",
help='The number of times to send an ack packet in response to receiving a packet.',
),
cfg.IntOpt(
"packet_list_maxlen",
'packet_list_maxlen',
default=100,
help="The maximum number of packets to store in the packet list.",
help='The maximum number of packets to store in the packet list.',
),
cfg.IntOpt(
"packet_list_stats_maxlen",
'packet_list_stats_maxlen',
default=20,
help="The maximum number of packets to send in the stats dict for admin ui.",
help='The maximum number of packets to send in the stats dict for admin ui. -1 means no max.',
),
cfg.BoolOpt(
"enable_seen_list",
'enable_seen_list',
default=True,
help="Enable the Callsign seen list tracking feature. This allows aprsd to keep track of "
"callsigns that have been seen and when they were last seen.",
help='Enable the Callsign seen list tracking feature. This allows aprsd to keep track of '
'callsigns that have been seen and when they were last seen.',
),
cfg.BoolOpt(
"enable_packet_logging",
'enable_packet_logging',
default=True,
help="Set this to False, to disable logging of packets to the log file.",
help='Set this to False, to disable logging of packets to the log file.',
),
cfg.BoolOpt(
"load_help_plugin",
'load_help_plugin',
default=True,
help="Set this to False to disable the help plugin.",
help='Set this to False to disable the help plugin.',
),
cfg.BoolOpt(
"enable_sending_ack_packets",
'enable_sending_ack_packets',
default=True,
help="Set this to False, to disable sending of ack packets. This will entirely stop"
"APRSD from sending ack packets.",
help='Set this to False, to disable sending of ack packets. This will entirely stop '
'APRSD from sending ack packets.',
),
]
watch_list_opts = [
cfg.BoolOpt(
"enabled",
'enabled',
default=False,
help="Enable the watch list feature. Still have to enable "
"the correct plugin. Built-in plugin to use is "
"aprsd.plugins.notify.NotifyPlugin",
help='Enable the watch list feature. Still have to enable '
'the correct plugin. Built-in plugin to use is '
'aprsd.plugins.notify.NotifyPlugin',
),
cfg.ListOpt(
"callsigns",
help="Callsigns to watch for messsages",
'callsigns',
help='Callsigns to watch for messages',
),
cfg.StrOpt(
"alert_callsign",
help="The Ham Callsign to send messages to for watch list alerts.",
'alert_callsign',
help='The Ham Callsign to send messages to for watch list alerts.',
),
cfg.IntOpt(
"packet_keep_count",
'packet_keep_count',
default=10,
help="The number of packets to store.",
help='The number of packets to store.',
),
cfg.IntOpt(
"alert_time_seconds",
'alert_time_seconds',
default=3600,
help="Time to wait before alert is sent on new message for "
"users in callsigns.",
help='Time to wait before alert is sent on new message for users in callsigns.',
),
]
enabled_plugins_opts = [
cfg.ListOpt(
"enabled_plugins",
'enabled_plugins',
default=[
"aprsd.plugins.fortune.FortunePlugin",
"aprsd.plugins.location.LocationPlugin",
"aprsd.plugins.ping.PingPlugin",
"aprsd.plugins.time.TimePlugin",
"aprsd.plugins.weather.OWMWeatherPlugin",
"aprsd.plugins.version.VersionPlugin",
"aprsd.plugins.notify.NotifySeenPlugin",
'aprsd.plugins.fortune.FortunePlugin',
'aprsd.plugins.location.LocationPlugin',
'aprsd.plugins.ping.PingPlugin',
'aprsd.plugins.time.TimePlugin',
'aprsd.plugins.weather.OWMWeatherPlugin',
'aprsd.plugins.version.VersionPlugin',
'aprsd.plugins.notify.NotifySeenPlugin',
],
help="Comma separated list of enabled plugins for APRSD."
"To enable installed external plugins add them here."
"The full python path to the class name must be used",
help='Comma separated list of enabled plugins for APRSD. '
'To enable installed external plugins add them here. '
'The full python path to the class name must be used',
),
]
registry_opts = [
cfg.BoolOpt(
"enabled",
'enabled',
default=False,
help="Enable sending aprs registry information. This will let the "
help='Enable sending aprs registry information. This will let the '
"APRS registry know about your service and it's uptime. "
"No personal information is sent, just the callsign, uptime and description. "
"The service callsign is the callsign set in [DEFAULT] section.",
'No personal information is sent, just the callsign, uptime and description. '
'The service callsign is the callsign set in [DEFAULT] section.',
),
cfg.StrOpt(
"description",
'description',
default=None,
help="Description of the service to send to the APRS registry. "
"This is what will show up in the APRS registry."
"If not set, the description will be the same as the callsign.",
help='Description of the service to send to the APRS registry. '
'This is what will show up in the APRS registry. '
'If not set, the description will be the same as the callsign.',
),
cfg.StrOpt(
"registry_url",
default="https://aprs.hemna.com/api/v1/registry",
help="The APRS registry domain name to send the information to.",
'registry_url',
default='https://aprs.hemna.com/api/v1/registry',
help='The APRS registry domain name to send the information to.',
),
cfg.StrOpt(
"service_website",
'service_website',
default=None,
help="The website for your APRS service to send to the APRS registry.",
help='The website for your APRS service to send to the APRS registry.',
),
cfg.IntOpt(
"frequency_seconds",
'frequency_seconds',
default=3600,
help="The frequency in seconds to send the APRS registry information.",
help='The frequency in seconds to send the APRS registry information.',
),
]
@ -232,7 +231,7 @@ def register_opts(config):
def list_opts():
return {
"DEFAULT": (aprsd_opts + enabled_plugins_opts),
'DEFAULT': (aprsd_opts + enabled_plugins_opts),
watch_list_group.name: watch_list_opts,
registry_group.name: registry_opts,
}
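
A hedged sketch (not part of this changeset) of how the options above are read at runtime through oslo.config, matching the CONF usage elsewhere in this compare; it assumes aprsd has already registered the options and parsed its config file.

from oslo_config import cfg

CONF = cfg.CONF
if CONF.enable_beacon and CONF.latitude and CONF.longitude:
    print(f'Beaconing every {CONF.beacon_interval} seconds as {CONF.callsign}')
if CONF.aprs_registry.enabled:
    print(f'Reporting to registry at {CONF.aprs_registry.registry_url}')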

View File

@ -7,47 +7,57 @@ import logging
from oslo_config import cfg
LOG_LEVELS = {
"CRITICAL": logging.CRITICAL,
"ERROR": logging.ERROR,
"WARNING": logging.WARNING,
"INFO": logging.INFO,
"DEBUG": logging.DEBUG,
'CRITICAL': logging.CRITICAL,
'ERROR': logging.ERROR,
'WARNING': logging.WARNING,
'INFO': logging.INFO,
'DEBUG': logging.DEBUG,
}
DEFAULT_DATE_FORMAT = "%m/%d/%Y %I:%M:%S %p"
DEFAULT_DATE_FORMAT = '%m/%d/%Y %I:%M:%S %p'
DEFAULT_LOG_FORMAT = (
"[%(asctime)s] [%(threadName)-20.20s] [%(levelname)-5.5s]"
" %(message)s - [%(pathname)s:%(lineno)d]"
'[%(asctime)s] [%(threadName)-20.20s] [%(levelname)-5.5s]'
' %(message)s - [%(pathname)s:%(lineno)d]'
)
DEFAULT_LOG_FORMAT = (
"<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | "
"<yellow>{thread.name: <18}</yellow> | "
"<level>{level: <8}</level> | "
"<level>{message}</level> | "
"<cyan>{name}</cyan>:<cyan>{function:}</cyan>:<magenta>{line:}</magenta>"
'<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | '
'<yellow>{thread.name: <18}</yellow> | '
'<level>{level: <8}</level> | '
'<level>{message}</level> | '
'<cyan>{name}</cyan>:<cyan>{function:}</cyan>:<magenta>{line:}</magenta>'
)
logging_group = cfg.OptGroup(
name="logging",
title="Logging options",
name='logging',
title='Logging options',
)
logging_opts = [
cfg.StrOpt(
"logfile",
'logfile',
default=None,
help="File to log to",
help='File to log to',
),
cfg.StrOpt(
"logformat",
'logformat',
default=DEFAULT_LOG_FORMAT,
help="Log file format, unless rich_logging enabled.",
help='Log file format, unless rich_logging enabled.',
),
cfg.StrOpt(
"log_level",
default="INFO",
'log_level',
default='INFO',
choices=LOG_LEVELS.keys(),
help="Log level for logging of events.",
help='Log level for logging of events.',
),
cfg.BoolOpt(
'enable_color',
default=True,
help='Enable ANSI color codes in logging',
),
cfg.BoolOpt(
'enable_console_stdout',
default=True,
help='Enable logging to the console/stdout.',
),
]

View File

@ -51,7 +51,7 @@ class InterceptHandler(logging.Handler):
# Setup the log facility
# to disable log to stdout, but still log to file
# use the --quiet option on the cmdln
def setup_logging(loglevel=None, quiet=False):
def setup_logging(loglevel=None, quiet=False, custom_handler=None):
if not loglevel:
log_level = CONF.logging.log_level
else:
@ -63,37 +63,53 @@ def setup_logging(loglevel=None, quiet=False):
# We don't really want to see the aprslib parsing debug output.
disable_list = [
"aprslib",
"aprslib.parsing",
"aprslib.exceptions",
'aprslib',
'aprslib.parsing',
'aprslib.exceptions',
]
chardet_list = [
'chardet',
'chardet.charsetprober',
'chardet.eucjpprober',
]
for name in chardet_list:
disable = logging.getLogger(name)
disable.setLevel(logging.ERROR)
# remove every other logger's handlers
# and propagate to root logger
for name in logging.root.manager.loggerDict.keys():
logging.getLogger(name).handlers = []
logging.getLogger(name).propagate = name not in disable_list
handlers = [
{
"sink": sys.stdout,
"serialize": False,
"format": CONF.logging.logformat,
"colorize": True,
"level": log_level,
},
]
if CONF.logging.logfile:
handlers = []
if CONF.logging.enable_console_stdout and not quiet:
handlers.append(
{
"sink": CONF.logging.logfile,
"serialize": False,
"format": CONF.logging.logformat,
"colorize": False,
"level": log_level,
'sink': sys.stdout,
'serialize': False,
'format': CONF.logging.logformat,
'colorize': CONF.logging.enable_color,
'level': log_level,
},
)
if CONF.logging.logfile:
handlers.append(
{
'sink': CONF.logging.logfile,
'serialize': False,
'format': CONF.logging.logformat,
'colorize': False,
'level': log_level,
},
)
if custom_handler:
handlers.append(custom_handler)
# configure loguru
logger.configure(handlers=handlers)
logger.level("DEBUG", color="<fg #BABABA>")
logger.level('DEBUG', color='<fg #BABABA>')
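
A hedged sketch (not part of this changeset) of passing a custom loguru handler dict into setup_logging(); the import path of the log module is an assumption.

import sys

from aprsd.log import log  # assumed module path

extra_handler = {
    'sink': sys.stderr,  # any loguru-compatible sink works
    'serialize': False,
    'format': '{time} | {level} | {message}',
    'colorize': False,
    'level': 'INFO',
}
log.setup_logging(loglevel='DEBUG', quiet=True, custom_handler=extra_handler)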

View File

@ -23,7 +23,6 @@
import datetime
import importlib.metadata as imp
import logging
import signal
import sys
import time
from importlib.metadata import version as metadata_version
@ -39,9 +38,8 @@ from aprsd.stats import collector
# setup the global logger
# log.basicConfig(level=log.DEBUG) # level=10
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"])
flask_enabled = False
LOG = logging.getLogger('APRSD')
CONTEXT_SETTINGS = dict(help_option_names=['-h', '--help'])
@click.group(cls=cli_helper.AliasedGroup, context_settings=CONTEXT_SETTINGS)
@ -68,18 +66,16 @@ def main():
# First import all the possible commands for the CLI
# The commands themselves live in the cmds directory
load_commands()
utils.load_entry_points("aprsd.extension")
cli(auto_envvar_prefix="APRSD")
utils.load_entry_points('aprsd.extension')
cli(auto_envvar_prefix='APRSD')
def signal_handler(sig, frame):
global flask_enabled
click.echo("signal_handler: called")
click.echo('signal_handler: called')
threads.APRSDThreadList().stop_all()
if "subprocess" not in str(frame):
if 'subprocess' not in str(frame):
LOG.info(
"Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}".format(
'Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}'.format(
datetime.datetime.now(),
),
)
@ -91,14 +87,11 @@ def signal_handler(sig, frame):
packets.PacketList().save()
collector.Collector().collect()
except Exception as e:
LOG.error(f"Failed to save data: {e}")
LOG.error(f'Failed to save data: {e}')
sys.exit(0)
# signal.signal(signal.SIGTERM, sys.exit(0))
# sys.exit(0)
if flask_enabled:
signal.signal(signal.SIGTERM, sys.exit(0))
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@ -108,9 +101,9 @@ def check_version(ctx):
"""Check this version against the latest in pypi.org."""
level, msg = utils._check_version()
if level:
click.secho(msg, fg="yellow")
click.secho(msg, fg='yellow')
else:
click.secho(msg, fg="green")
click.secho(msg, fg='green')
@cli.command()
@ -124,12 +117,12 @@ def sample_config(ctx):
if sys.version_info < (3, 10):
all = imp.entry_points()
selected = []
if "oslo.config.opts" in all:
for x in all["oslo.config.opts"]:
if x.group == "oslo.config.opts":
if 'oslo.config.opts' in all:
for x in all['oslo.config.opts']:
if x.group == 'oslo.config.opts':
selected.append(x)
else:
selected = imp.entry_points(group="oslo.config.opts")
selected = imp.entry_points(group='oslo.config.opts')
return selected
@ -139,23 +132,23 @@ def sample_config(ctx):
# selected = imp.entry_points(group="oslo.config.opts")
selected = _get_selected_entry_points()
for entry in selected:
if "aprsd" in entry.name:
args.append("--namespace")
if 'aprsd' in entry.name:
args.append('--namespace')
args.append(entry.name)
return args
args = get_namespaces()
config_version = metadata_version("oslo.config")
config_version = metadata_version('oslo.config')
logging.basicConfig(level=logging.WARN)
conf = cfg.ConfigOpts()
generator.register_cli_opts(conf)
try:
conf(args, version=config_version)
except cfg.RequiredOptError:
except cfg.RequiredOptError as ex:
conf.print_help()
if not sys.argv[1:]:
raise SystemExit
raise SystemExit from ex
raise
generator.generate(conf)
return
@ -165,9 +158,9 @@ def sample_config(ctx):
@click.pass_context
def version(ctx):
"""Show the APRSD version."""
click.echo(click.style("APRSD Version : ", fg="white"), nl=False)
click.secho(f"{aprsd.__version__}", fg="yellow", bold=True)
click.echo(click.style('APRSD Version : ', fg='white'), nl=False)
click.secho(f'{aprsd.__version__}', fg='yellow', bold=True)
if __name__ == "__main__":
if __name__ == '__main__':
main()

View File

@ -15,6 +15,8 @@ from aprsd.packets.core import ( # noqa: F401
WeatherPacket,
factory,
)
from aprsd.packets.filter import PacketFilter
from aprsd.packets.filters.dupe_filter import DupePacketFilter
from aprsd.packets.packet_list import PacketList # noqa: F401
from aprsd.packets.seen_list import SeenList # noqa: F401
from aprsd.packets.tracker import PacketTrack # noqa: F401
@ -26,5 +28,9 @@ collector.PacketCollector().register(SeenList)
collector.PacketCollector().register(PacketTrack)
collector.PacketCollector().register(WatchList)
# Register all the packet filters for normal processing
# For specific commands you can deregister these if you don't want them.
PacketFilter().register(DupePacketFilter)
NULL_MESSAGE = -1
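
A hedged sketch (not part of this changeset) of the opt-out pattern the comment above describes: a command that does not want seen-list tracking or dupe filtering can deregister them, as the listen command does.

from aprsd.packets import collector, seen_list
from aprsd.packets.filter import PacketFilter
from aprsd.packets.filters.dupe_filter import DupePacketFilter

collector.PacketCollector().unregister(seen_list.SeenList)
PacketFilter().unregister(DupePacketFilter)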

View File

@ -19,26 +19,26 @@ from loguru import logger
from aprsd.utils import counter
# For mypy to be happy
A = TypeVar("A", bound="DataClassJsonMixin")
A = TypeVar('A', bound='DataClassJsonMixin')
Json = Union[dict, list, str, int, float, bool, None]
LOG = logging.getLogger()
LOGU = logger
PACKET_TYPE_BULLETIN = "bulletin"
PACKET_TYPE_MESSAGE = "message"
PACKET_TYPE_ACK = "ack"
PACKET_TYPE_REJECT = "reject"
PACKET_TYPE_MICE = "mic-e"
PACKET_TYPE_WX = "wx"
PACKET_TYPE_WEATHER = "weather"
PACKET_TYPE_OBJECT = "object"
PACKET_TYPE_UNKNOWN = "unknown"
PACKET_TYPE_STATUS = "status"
PACKET_TYPE_BEACON = "beacon"
PACKET_TYPE_THIRDPARTY = "thirdparty"
PACKET_TYPE_TELEMETRY = "telemetry-message"
PACKET_TYPE_UNCOMPRESSED = "uncompressed"
PACKET_TYPE_BULLETIN = 'bulletin'
PACKET_TYPE_MESSAGE = 'message'
PACKET_TYPE_ACK = 'ack'
PACKET_TYPE_REJECT = 'reject'
PACKET_TYPE_MICE = 'mic-e'
PACKET_TYPE_WX = 'wx'
PACKET_TYPE_WEATHER = 'weather'
PACKET_TYPE_OBJECT = 'object'
PACKET_TYPE_UNKNOWN = 'unknown'
PACKET_TYPE_STATUS = 'status'
PACKET_TYPE_BEACON = 'beacon'
PACKET_TYPE_THIRDPARTY = 'thirdparty'
PACKET_TYPE_TELEMETRY = 'telemetry-message'
PACKET_TYPE_UNCOMPRESSED = 'uncompressed'
NO_DATE = datetime(1900, 10, 24)
@ -67,14 +67,14 @@ def _init_msgNo(): # noqa: N802
def _translate_fields(raw: dict) -> dict:
# Direct key checks instead of iteration
if "from" in raw:
raw["from_call"] = raw.pop("from")
if "to" in raw:
raw["to_call"] = raw.pop("to")
if 'from' in raw:
raw['from_call'] = raw.pop('from')
if 'to' in raw:
raw['to_call'] = raw.pop('to')
# addresse overrides to_call
if "addresse" in raw:
raw["to_call"] = raw["addresse"]
if 'addresse' in raw:
raw['to_call'] = raw['addresse']
return raw
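
A hedged sketch (not part of this changeset) of what _translate_fields() does to a raw aprslib dict before it is handed to the Packet dataclass; the callsigns are made up.

raw = {'from': 'N0CALL', 'to': 'APZ100', 'addresse': 'N0CALL-1'}
raw = _translate_fields(raw)
# raw == {'addresse': 'N0CALL-1', 'from_call': 'N0CALL', 'to_call': 'N0CALL-1'}
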
@ -82,7 +82,7 @@ def _translate_fields(raw: dict) -> dict:
@dataclass_json
@dataclass(unsafe_hash=True)
class Packet:
_type: str = field(default="Packet", hash=False)
_type: str = field(default='Packet', hash=False)
from_call: Optional[str] = field(default=None)
to_call: Optional[str] = field(default=None)
addresse: Optional[str] = field(default=None)
@ -106,6 +106,8 @@ class Packet:
last_send_time: float = field(repr=False, default=0, compare=False, hash=False)
# Was the packet acked?
acked: bool = field(repr=False, default=False, compare=False, hash=False)
# Was the packet previously processed (for dupe checking)
processed: bool = field(repr=False, default=False, compare=False, hash=False)
# Do we allow this packet to be saved to send later?
allow_delay: bool = field(repr=False, default=True, compare=False, hash=False)
@ -118,7 +120,7 @@ class Packet:
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.addresse}:{self.msgNo}"
return f'{self.from_call}:{self.addresse}:{self.msgNo}'
def update_timestamp(self) -> None:
self.timestamp = _init_timestamp()
@ -131,7 +133,7 @@ class Packet:
the human readable payload.
"""
self.prepare()
msg = self._filter_for_send(self.raw).rstrip("\n")
msg = self._filter_for_send(self.raw).rstrip('\n')
return msg
def prepare(self, create_msg_number=False) -> None:
@ -150,11 +152,11 @@ class Packet:
)
# The base packet class has no real payload
self.payload = f":{self.to_call.ljust(9)}"
self.payload = f':{self.to_call.ljust(9)}'
def _build_raw(self) -> None:
"""Build the self.raw which is what is sent over the air."""
self.raw = "{}>APZ100:{}".format(
self.raw = '{}>APZ100:{}'.format(
self.from_call,
self.payload,
)
@ -166,13 +168,13 @@ class Packet:
# 67 displays 64 on the ftm400. (+3 {01 suffix)
# feature req: break long ones into two msgs
if not msg:
return ""
return ''
message = msg[:67]
# We all miss George Carlin
return re.sub(
"fuck|shit|cunt|piss|cock|bitch",
"****",
'fuck|shit|cunt|piss|cock|bitch',
'****',
message,
flags=re.IGNORECASE,
)
@ -181,101 +183,98 @@ class Packet:
"""Show the raw version of the packet"""
self.prepare()
if not self.raw:
raise ValueError("self.raw is unset")
raise ValueError('self.raw is unset')
return self.raw
def __repr__(self) -> str:
"""Build the repr version of the packet."""
repr = (
f"{self.__class__.__name__}:"
f" From: {self.from_call} "
f" To: {self.to_call}"
return (
f'{self.__class__.__name__}: From: {self.from_call} To: {self.to_call}'
)
return repr
@dataclass_json
@dataclass(unsafe_hash=True)
class AckPacket(Packet):
_type: str = field(default="AckPacket", hash=False)
_type: str = field(default='AckPacket', hash=False)
def _build_payload(self):
self.payload = f":{self.to_call: <9}:ack{self.msgNo}"
self.payload = f':{self.to_call: <9}:ack{self.msgNo}'
@dataclass_json
@dataclass(unsafe_hash=True)
class BulletinPacket(Packet):
_type: str = "BulletinPacket"
_type: str = 'BulletinPacket'
# Holds the encapsulated packet
bid: Optional[str] = field(default="1")
bid: Optional[str] = field(default='1')
message_text: Optional[str] = field(default=None)
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:BLN{self.bid}"
return f'{self.from_call}:BLN{self.bid}'
@property
def human_info(self) -> str:
return f"BLN{self.bid} {self.message_text}"
return f'BLN{self.bid} {self.message_text}'
def _build_payload(self) -> None:
self.payload = f":BLN{self.bid:<9}" f":{self.message_text}"
self.payload = f':BLN{self.bid:<9}:{self.message_text}'
@dataclass_json
@dataclass(unsafe_hash=True)
class RejectPacket(Packet):
_type: str = field(default="RejectPacket", hash=False)
_type: str = field(default='RejectPacket', hash=False)
response: Optional[str] = field(default=None)
def __post__init__(self):
if self.response:
LOG.warning("Response set!")
LOG.warning('Response set!')
def _build_payload(self):
self.payload = f":{self.to_call: <9}:rej{self.msgNo}"
self.payload = f':{self.to_call: <9}:rej{self.msgNo}'
@dataclass_json
@dataclass(unsafe_hash=True)
class MessagePacket(Packet):
_type: str = field(default="MessagePacket", hash=False)
_type: str = field(default='MessagePacket', hash=False)
message_text: Optional[str] = field(default=None)
@property
def human_info(self) -> str:
self.prepare()
return self._filter_for_send(self.message_text).rstrip("\n")
return self._filter_for_send(self.message_text).rstrip('\n')
def _build_payload(self):
if self.msgNo:
self.payload = ":{}:{}{{{}".format(
self.payload = ':{}:{}{{{}'.format(
self.to_call.ljust(9),
self._filter_for_send(self.message_text).rstrip("\n"),
self._filter_for_send(self.message_text).rstrip('\n'),
str(self.msgNo),
)
else:
self.payload = ":{}:{}".format(
self.payload = ':{}:{}'.format(
self.to_call.ljust(9),
self._filter_for_send(self.message_text).rstrip("\n"),
self._filter_for_send(self.message_text).rstrip('\n'),
)
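
For illustration only (not part of this diff), the payload format above produces lines like the following for invented values to_call='N0CALL', message_text='hello', msgNo='3'; note the literal '{' that the doubled brace in the format string emits before the message number:

# With msgNo='3':  payload == ':N0CALL   :hello{3'
# Without msgNo:   payload == ':N0CALL   :hello'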
@dataclass_json
@dataclass(unsafe_hash=True)
class StatusPacket(Packet):
_type: str = field(default="StatusPacket", hash=False)
_type: str = field(default='StatusPacket', hash=False)
status: Optional[str] = field(default=None)
messagecapable: bool = field(default=False)
comment: Optional[str] = field(default=None)
raw_timestamp: Optional[str] = field(default=None)
def _build_payload(self):
self.payload = ":{}:{}{{{}".format(
self.payload = ':{}:{}{{{}'.format(
self.to_call.ljust(9),
self._filter_for_send(self.status).rstrip("\n"),
self._filter_for_send(self.status).rstrip('\n'),
str(self.msgNo),
)
@ -288,7 +287,7 @@ class StatusPacket(Packet):
@dataclass_json
@dataclass(unsafe_hash=True)
class GPSPacket(Packet):
_type: str = field(default="GPSPacket", hash=False)
_type: str = field(default='GPSPacket', hash=False)
latitude: float = field(default=0.00)
longitude: float = field(default=0.00)
altitude: float = field(default=0.00)
@ -296,8 +295,8 @@ class GPSPacket(Packet):
posambiguity: int = field(default=0)
messagecapable: bool = field(default=False)
comment: Optional[str] = field(default=None)
symbol: str = field(default="l")
symbol_table: str = field(default="/")
symbol: str = field(default='l')
symbol_table: str = field(default='/')
raw_timestamp: Optional[str] = field(default=None)
object_name: Optional[str] = field(default=None)
object_format: Optional[str] = field(default=None)
@ -317,7 +316,7 @@ class GPSPacket(Packet):
def _build_time_zulu(self):
"""Build the timestamp in UTC/zulu."""
if self.timestamp:
return datetime.utcfromtimestamp(self.timestamp).strftime("%d%H%M")
return datetime.utcfromtimestamp(self.timestamp).strftime('%d%H%M')
def _build_payload(self):
"""The payload is the non headers portion of the packet."""
@ -325,7 +324,7 @@ class GPSPacket(Packet):
lat = aprslib_util.latitude_to_ddm(self.latitude)
long = aprslib_util.longitude_to_ddm(self.longitude)
payload = [
"@" if self.timestamp else "!",
'@' if self.timestamp else '!',
time_zulu,
lat,
self.symbol_table,
@ -336,34 +335,34 @@ class GPSPacket(Packet):
if self.comment:
payload.append(self._filter_for_send(self.comment))
self.payload = "".join(payload)
self.payload = ''.join(payload)
def _build_raw(self):
self.raw = f"{self.from_call}>{self.to_call},WIDE2-1:" f"{self.payload}"
self.raw = f'{self.from_call}>{self.to_call},WIDE2-1:{self.payload}'
@property
def human_info(self) -> str:
h_str = []
h_str.append(f"Lat:{self.latitude:03.3f}")
h_str.append(f"Lon:{self.longitude:03.3f}")
h_str.append(f'Lat:{self.latitude:03.3f}')
h_str.append(f'Lon:{self.longitude:03.3f}')
if self.altitude:
h_str.append(f"Altitude {self.altitude:03.0f}")
h_str.append(f'Altitude {self.altitude:03.0f}')
if self.speed:
h_str.append(f"Speed {self.speed:03.0f}MPH")
h_str.append(f'Speed {self.speed:03.0f}MPH')
if self.course:
h_str.append(f"Course {self.course:03.0f}")
h_str.append(f'Course {self.course:03.0f}')
if self.rng:
h_str.append(f"RNG {self.rng:03.0f}")
h_str.append(f'RNG {self.rng:03.0f}')
if self.phg:
h_str.append(f"PHG {self.phg}")
h_str.append(f'PHG {self.phg}')
return " ".join(h_str)
return ' '.join(h_str)
@dataclass_json
@dataclass(unsafe_hash=True)
class BeaconPacket(GPSPacket):
_type: str = field(default="BeaconPacket", hash=False)
_type: str = field(default='BeaconPacket', hash=False)
def _build_payload(self):
"""The payload is the non headers portion of the packet."""
@ -371,42 +370,42 @@ class BeaconPacket(GPSPacket):
lat = aprslib_util.latitude_to_ddm(self.latitude)
lon = aprslib_util.longitude_to_ddm(self.longitude)
self.payload = f"@{time_zulu}z{lat}{self.symbol_table}" f"{lon}"
self.payload = f'@{time_zulu}z{lat}{self.symbol_table}{lon}'
if self.comment:
comment = self._filter_for_send(self.comment)
self.payload = f"{self.payload}{self.symbol}{comment}"
self.payload = f'{self.payload}{self.symbol}{comment}'
else:
self.payload = f"{self.payload}{self.symbol}APRSD Beacon"
self.payload = f'{self.payload}{self.symbol}APRSD Beacon'
def _build_raw(self):
self.raw = f"{self.from_call}>APZ100:" f"{self.payload}"
self.raw = f'{self.from_call}>APZ100:{self.payload}'
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
if self.raw_timestamp:
return f"{self.from_call}:{self.raw_timestamp}"
return f'{self.from_call}:{self.raw_timestamp}'
else:
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
return f'{self.from_call}:{self.human_info.replace(" ", "")}'
@property
def human_info(self) -> str:
h_str = []
h_str.append(f"Lat:{self.latitude:03.3f}")
h_str.append(f"Lon:{self.longitude:03.3f}")
h_str.append(f"{self.comment}")
return " ".join(h_str)
h_str.append(f'Lat:{self.latitude:03.3f}')
h_str.append(f'Lon:{self.longitude:03.3f}')
h_str.append(f'{self.comment}')
return ' '.join(h_str)
@dataclass_json
@dataclass(unsafe_hash=True)
class MicEPacket(GPSPacket):
_type: str = field(default="MicEPacket", hash=False)
_type: str = field(default='MicEPacket', hash=False)
messagecapable: bool = False
mbits: Optional[str] = None
mtype: Optional[str] = None
telemetry: Optional[dict] = field(default=None)
telemetry: Optional[dict] = field(default=None, hash=False)
# in MPH
speed: float = 0.00
# 0 to 360
@ -415,24 +414,24 @@ class MicEPacket(GPSPacket):
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
return f'{self.from_call}:{self.human_info.replace(" ", "")}'
@property
def human_info(self) -> str:
h_info = super().human_info
return f"{h_info} {self.mbits} mbits"
return f'{h_info} {self.mbits} mbits'
@dataclass_json
@dataclass(unsafe_hash=True)
class TelemetryPacket(GPSPacket):
_type: str = field(default="TelemetryPacket", hash=False)
_type: str = field(default='TelemetryPacket', hash=False)
messagecapable: bool = False
mbits: Optional[str] = None
mtype: Optional[str] = None
telemetry: Optional[dict] = field(default=None)
tPARM: Optional[list[str]] = field(default=None) # noqa: N815
tUNIT: Optional[list[str]] = field(default=None) # noqa: N815
tPARM: Optional[list[str]] = field(default=None, hash=False) # noqa: N815
tUNIT: Optional[list[str]] = field(default=None, hash=False) # noqa: N815
# in MPH
speed: float = 0.00
# 0 to 360
@ -442,23 +441,23 @@ class TelemetryPacket(GPSPacket):
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
if self.raw_timestamp:
return f"{self.from_call}:{self.raw_timestamp}"
return f'{self.from_call}:{self.raw_timestamp}'
else:
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
return f'{self.from_call}:{self.human_info.replace(" ", "")}'
@property
def human_info(self) -> str:
h_info = super().human_info
return f"{h_info} {self.telemetry}"
return f'{h_info} {self.telemetry}'
@dataclass_json
@dataclass(unsafe_hash=True)
class ObjectPacket(GPSPacket):
_type: str = field(default="ObjectPacket", hash=False)
_type: str = field(default='ObjectPacket', hash=False)
alive: bool = True
raw_timestamp: Optional[str] = None
symbol: str = field(default="r")
symbol: str = field(default='r')
# in MPH
speed: float = 0.00
# 0 to 360
@ -469,11 +468,11 @@ class ObjectPacket(GPSPacket):
lat = aprslib_util.latitude_to_ddm(self.latitude)
long = aprslib_util.longitude_to_ddm(self.longitude)
self.payload = f"*{time_zulu}z{lat}{self.symbol_table}" f"{long}{self.symbol}"
self.payload = f'*{time_zulu}z{lat}{self.symbol_table}{long}{self.symbol}'
if self.comment:
comment = self._filter_for_send(self.comment)
self.payload = f"{self.payload}{comment}"
self.payload = f'{self.payload}{comment}'
def _build_raw(self):
"""
@ -486,18 +485,18 @@ class ObjectPacket(GPSPacket):
The frequency, uplink_tone, offset is part of the comment
"""
self.raw = f"{self.from_call}>APZ100:;{self.to_call:9s}" f"{self.payload}"
self.raw = f'{self.from_call}>APZ100:;{self.to_call:9s}{self.payload}'
@property
def human_info(self) -> str:
h_info = super().human_info
return f"{h_info} {self.comment}"
return f'{h_info} {self.comment}'
@dataclass(unsafe_hash=True)
class WeatherPacket(GPSPacket, DataClassJsonMixin):
_type: str = field(default="WeatherPacket", hash=False)
symbol: str = "_"
_type: str = field(default='WeatherPacket', hash=False)
symbol: str = '_'
wind_speed: float = 0.00
wind_direction: int = 0
wind_gust: float = 0.00
@ -515,8 +514,8 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
speed: Optional[float] = field(default=None)
def _translate(self, raw: dict) -> dict:
for key in raw["weather"]:
raw[key] = raw["weather"][key]
for key in raw['weather']:
raw[key] = raw['weather'][key]
# If we have the broken aprslib, then we need to
# Convert the course and speed to wind_speed and wind_direction
@ -524,36 +523,36 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
# https://github.com/rossengeorgiev/aprs-python/issues/80
# Wind speed and course are optional in the SPEC.
# For some reason aprslib multiplies the speed by 1.852.
if "wind_speed" not in raw and "wind_direction" not in raw:
if 'wind_speed' not in raw and 'wind_direction' not in raw:
# Most likely this is the broken aprslib
# So we need to convert the wind_gust speed
raw["wind_gust"] = round(raw.get("wind_gust", 0) / 0.44704, 3)
if "wind_speed" not in raw:
wind_speed = raw.get("speed")
raw['wind_gust'] = round(raw.get('wind_gust', 0) / 0.44704, 3)
if 'wind_speed' not in raw:
wind_speed = raw.get('speed')
if wind_speed:
raw["wind_speed"] = round(wind_speed / 1.852, 3)
raw["weather"]["wind_speed"] = raw["wind_speed"]
if "speed" in raw:
del raw["speed"]
raw['wind_speed'] = round(wind_speed / 1.852, 3)
raw['weather']['wind_speed'] = raw['wind_speed']
if 'speed' in raw:
del raw['speed']
# Let's adjust the rain numbers as well, since it's wrong
raw["rain_1h"] = round((raw.get("rain_1h", 0) / 0.254) * 0.01, 3)
raw["weather"]["rain_1h"] = raw["rain_1h"]
raw["rain_24h"] = round((raw.get("rain_24h", 0) / 0.254) * 0.01, 3)
raw["weather"]["rain_24h"] = raw["rain_24h"]
raw["rain_since_midnight"] = round(
(raw.get("rain_since_midnight", 0) / 0.254) * 0.01, 3
raw['rain_1h'] = round((raw.get('rain_1h', 0) / 0.254) * 0.01, 3)
raw['weather']['rain_1h'] = raw['rain_1h']
raw['rain_24h'] = round((raw.get('rain_24h', 0) / 0.254) * 0.01, 3)
raw['weather']['rain_24h'] = raw['rain_24h']
raw['rain_since_midnight'] = round(
(raw.get('rain_since_midnight', 0) / 0.254) * 0.01, 3
)
raw["weather"]["rain_since_midnight"] = raw["rain_since_midnight"]
raw['weather']['rain_since_midnight'] = raw['rain_since_midnight']
if "wind_direction" not in raw:
wind_direction = raw.get("course")
if 'wind_direction' not in raw:
wind_direction = raw.get('course')
if wind_direction:
raw["wind_direction"] = wind_direction
raw["weather"]["wind_direction"] = raw["wind_direction"]
if "course" in raw:
del raw["course"]
raw['wind_direction'] = wind_direction
raw['weather']['wind_direction'] = raw['wind_direction']
if 'course' in raw:
del raw['course']
del raw["weather"]
del raw['weather']
return raw
@classmethod
@ -566,20 +565,20 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
if self.raw_timestamp:
return f"{self.from_call}:{self.raw_timestamp}"
return f'{self.from_call}:{self.raw_timestamp}'
elif self.wx_raw_timestamp:
return f"{self.from_call}:{self.wx_raw_timestamp}"
return f'{self.from_call}:{self.wx_raw_timestamp}'
@property
def human_info(self) -> str:
h_str = []
h_str.append(f"Temp {self.temperature:03.0f}F")
h_str.append(f"Humidity {self.humidity}%")
h_str.append(f"Wind {self.wind_speed:03.0f}MPH@{self.wind_direction}")
h_str.append(f"Pressure {self.pressure}mb")
h_str.append(f"Rain {self.rain_24h}in/24hr")
h_str.append(f'Temp {self.temperature:03.0f}F')
h_str.append(f'Humidity {self.humidity}%')
h_str.append(f'Wind {self.wind_speed:03.0f}MPH@{self.wind_direction}')
h_str.append(f'Pressure {self.pressure}mb')
h_str.append(f'Rain {self.rain_24h}in/24hr')
return " ".join(h_str)
return ' '.join(h_str)
def _build_payload(self):
"""Build an uncompressed weather packet
@ -609,49 +608,49 @@ class WeatherPacket(GPSPacket, DataClassJsonMixin):
time_zulu = self._build_time_zulu()
contents = [
f"@{time_zulu}z{self.latitude}{self.symbol_table}",
f"{self.longitude}{self.symbol}",
f"{self.wind_direction:03d}",
f'@{time_zulu}z{self.latitude}{self.symbol_table}',
f'{self.longitude}{self.symbol}',
f'{self.wind_direction:03d}',
# Speed = sustained 1 minute wind speed in mph
f"{self.symbol_table}",
f"{self.wind_speed:03.0f}",
f'{self.symbol_table}',
f'{self.wind_speed:03.0f}',
# wind gust (peak wind speed in mph in the last 5 minutes)
f"g{self.wind_gust:03.0f}",
f'g{self.wind_gust:03.0f}',
# Temperature in degrees F
f"t{self.temperature:03.0f}",
f't{self.temperature:03.0f}',
# Rainfall (in hundredths of an inch) in the last hour
f"r{self.rain_1h * 100:03.0f}",
f'r{self.rain_1h * 100:03.0f}',
# Rainfall (in hundredths of an inch) in last 24 hours
f"p{self.rain_24h * 100:03.0f}",
f'p{self.rain_24h * 100:03.0f}',
# Rainfall (in hundredths of an inch) since midnight
f"P{self.rain_since_midnight * 100:03.0f}",
f'P{self.rain_since_midnight * 100:03.0f}',
# Humidity
f"h{self.humidity:02d}",
f'h{self.humidity:02d}',
# Barometric pressure (in tenths of millibars/tenths of hPascal)
f"b{self.pressure:05.0f}",
f'b{self.pressure:05.0f}',
]
if self.comment:
comment = self.filter_for_send(self.comment)
comment = self._filter_for_send(self.comment)
contents.append(comment)
self.payload = "".join(contents)
self.payload = ''.join(contents)
def _build_raw(self):
self.raw = f"{self.from_call}>{self.to_call},WIDE1-1,WIDE2-1:" f"{self.payload}"
self.raw = f'{self.from_call}>{self.to_call},WIDE1-1,WIDE2-1:{self.payload}'
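
For illustration only (not part of this diff), a payload assembled from the field codes commented above might look like this; every value is invented, the lat/lon are assumed to already be APRS DDM strings, and pressure is assumed to be stored in tenths of millibars as the comment states:

# Example payload (all values invented):
#   @092345z4903.50N/07201.75W_270/005g010t072r000p005P005h45b10132
# i.e. time, lat, lon, wind 270 deg at 5 mph gusting 10, 72F,
# rain 0.00 in last hour / 0.05 in 24h / 0.05 since midnight,
# 45% humidity, 1013.2 mb encoded as b10132.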
@dataclass(unsafe_hash=True)
class ThirdPartyPacket(Packet, DataClassJsonMixin):
_type: str = "ThirdPartyPacket"
_type: str = 'ThirdPartyPacket'
# Holds the encapsulated packet
subpacket: Optional[type[Packet]] = field(default=None, compare=True, hash=False)
def __repr__(self):
"""Build the repr version of the packet."""
repr_str = (
f"{self.__class__.__name__}:"
f" From: {self.from_call} "
f" To: {self.to_call} "
f" Subpacket: {repr(self.subpacket)}"
f'{self.__class__.__name__}:'
f' From: {self.from_call} '
f' To: {self.to_call} '
f' Subpacket: {repr(self.subpacket)}'
)
return repr_str
@ -665,12 +664,12 @@ class ThirdPartyPacket(Packet, DataClassJsonMixin):
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.subpacket.key}"
return f'{self.from_call}:{self.subpacket.key}'
@property
def human_info(self) -> str:
sub_info = self.subpacket.human_info
return f"{self.from_call}->{self.to_call} {sub_info}"
return f'{self.from_call}->{self.to_call} {sub_info}'
@dataclass_json(undefined=Undefined.INCLUDE)
@ -682,11 +681,12 @@ class UnknownPacket:
"""
unknown_fields: CatchAll
_type: str = "UnknownPacket"
_type: str = 'UnknownPacket'
from_call: Optional[str] = field(default=None)
to_call: Optional[str] = field(default=None)
msgNo: str = field(default_factory=_init_msgNo) # noqa: N815
format: Optional[str] = field(default=None)
timestamp: float = field(default_factory=_init_timestamp, compare=False, hash=False)
raw: Optional[str] = field(default=None)
raw_dict: dict = field(
repr=False, default_factory=lambda: {}, compare=False, hash=False
@ -694,11 +694,13 @@ class UnknownPacket:
path: List[str] = field(default_factory=list, compare=False, hash=False)
packet_type: Optional[str] = field(default=None)
via: Optional[str] = field(default=None, compare=False, hash=False)
# Was the packet previously processed (for dupe checking)
processed: bool = field(repr=False, default=False, compare=False, hash=False)
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.packet_type}:{self.to_call}"
return f'{self.from_call}:{self.packet_type}:{self.to_call}'
@property
def human_info(self) -> str:
@ -725,20 +727,20 @@ TYPE_LOOKUP: dict[str, type[Packet]] = {
def get_packet_type(packet: dict) -> str:
"""Decode the packet type from the packet."""
pkt_format = packet.get("format")
msg_response = packet.get("response")
pkt_format = packet.get('format')
msg_response = packet.get('response')
packet_type = PACKET_TYPE_UNKNOWN
if pkt_format == "message" and msg_response == "ack":
if pkt_format == 'message' and msg_response == 'ack':
packet_type = PACKET_TYPE_ACK
elif pkt_format == "message" and msg_response == "rej":
elif pkt_format == 'message' and msg_response == 'rej':
packet_type = PACKET_TYPE_REJECT
elif pkt_format == "message":
elif pkt_format == 'message':
packet_type = PACKET_TYPE_MESSAGE
elif pkt_format == "mic-e":
elif pkt_format == 'mic-e':
packet_type = PACKET_TYPE_MICE
elif pkt_format == "object":
elif pkt_format == 'object':
packet_type = PACKET_TYPE_OBJECT
elif pkt_format == "status":
elif pkt_format == 'status':
packet_type = PACKET_TYPE_STATUS
elif pkt_format == PACKET_TYPE_BULLETIN:
packet_type = PACKET_TYPE_BULLETIN
@ -749,13 +751,13 @@ def get_packet_type(packet: dict) -> str:
elif pkt_format == PACKET_TYPE_WX:
packet_type = PACKET_TYPE_WEATHER
elif pkt_format == PACKET_TYPE_UNCOMPRESSED:
if packet.get("symbol") == "_":
if packet.get('symbol') == '_':
packet_type = PACKET_TYPE_WEATHER
elif pkt_format == PACKET_TYPE_THIRDPARTY:
packet_type = PACKET_TYPE_THIRDPARTY
if packet_type == PACKET_TYPE_UNKNOWN:
if "latitude" in packet:
if 'latitude' in packet:
packet_type = PACKET_TYPE_BEACON
else:
packet_type = PACKET_TYPE_UNKNOWN
@ -777,32 +779,32 @@ def is_mice_packet(packet: dict[Any, Any]) -> bool:
def factory(raw_packet: dict[Any, Any]) -> type[Packet]:
"""Factory method to create a packet from a raw packet string."""
raw = raw_packet
if "_type" in raw:
cls = globals()[raw["_type"]]
if '_type' in raw:
cls = globals()[raw['_type']]
return cls.from_dict(raw)
raw["raw_dict"] = raw.copy()
raw['raw_dict'] = raw.copy()
raw = _translate_fields(raw)
packet_type = get_packet_type(raw)
raw["packet_type"] = packet_type
raw['packet_type'] = packet_type
packet_class = TYPE_LOOKUP[packet_type]
if packet_type == PACKET_TYPE_WX:
# the weather information is in a dict
# this brings those values out to the outer dict
packet_class = WeatherPacket
elif packet_type == PACKET_TYPE_OBJECT and "weather" in raw:
elif packet_type == PACKET_TYPE_OBJECT and 'weather' in raw:
packet_class = WeatherPacket
elif packet_type == PACKET_TYPE_UNKNOWN:
# Try and figure it out here
if "latitude" in raw:
if 'latitude' in raw:
packet_class = GPSPacket
else:
# LOG.warning(raw)
packet_class = UnknownPacket
raw.get("addresse", raw.get("to_call"))
raw.get('addresse', raw.get('to_call'))
# TODO: Find a global way to enable/disable this
# LOGU.opt(colors=True).info(

aprsd/packets/filter.py Normal file
View File

@ -0,0 +1,58 @@
import logging
from typing import Callable, Protocol, runtime_checkable, Union, Dict
from aprsd.packets import core
from aprsd.utils import singleton
LOG = logging.getLogger("APRSD")
@runtime_checkable
class PacketFilterProtocol(Protocol):
"""Protocol API for a packet filter class.
"""
def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
"""When we get a packet from the network.
Return a Packet object if the filter passes. Return None if the
Packet is filtered out.
"""
...
@singleton
class PacketFilter:
def __init__(self):
self.filters: Dict[str, Callable] = {}
def register(self, packet_filter: Callable) -> None:
if not isinstance(packet_filter, PacketFilterProtocol):
raise TypeError(f"class {packet_filter} is not a PacketFilterProtocol object")
if packet_filter not in self.filters:
self.filters[packet_filter] = packet_filter()
def unregister(self, packet_filter: Callable) -> None:
if not isinstance(packet_filter, PacketFilterProtocol):
raise TypeError(f"class {packet_filter} is not a PacketFilterProtocol object")
if packet_filter in self.filters:
del self.filters[packet_filter]
def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
"""Run through each of the filters.
This will step through each registered filter class
and call filter on it.
If the filter object returns None, we are done filtering.
If the filter object returns the packet, we continue filtering.
"""
for packet_filter in self.filters:
try:
if not self.filters[packet_filter].filter(packet):
LOG.debug(f"{self.filters[packet_filter].__class__.__name__} dropped {packet.__class__.__name__}:{packet.human_info}")
return None
except Exception as ex:
LOG.error(f"{packet_filter.__clas__.__name__} failed filtering packet {packet.__class__.__name__} : {ex}")
return packet
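
As a usage sketch (not part of this changeset), a filter only needs a filter() method that returns the packet to keep it or None to drop it, and can then be registered with the singleton; the class and callsign below are invented:

from typing import Union

from aprsd.packets import core
from aprsd.packets.filter import PacketFilter


class ToMeFilter:
    """Hypothetical filter: only keep packets addressed to MYCALL-1."""

    def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
        if packet.to_call == 'MYCALL-1':  # MYCALL-1 is a made-up callsign
            return packet
        return None


PacketFilter().register(ToMeFilter)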

View File

@ -0,0 +1,68 @@
import logging
from typing import Union
from oslo_config import cfg
from aprsd import packets
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
class DupePacketFilter:
"""This is a packet filter to detect duplicate packets.
This uses the PacketList object to see if a packet exists
already. If it does exist in the PacketList, then we need to
check the flag on the packet to see if it's been processed before.
If the packet has been processed already within the allowed
timeframe, then it's a dupe.
"""
def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
# LOG.debug(f"{self.__class__.__name__}.filter called for packet {packet}")
"""Filter a packet out if it's already been seen and processed."""
if isinstance(packet, core.AckPacket):
# We don't need to drop AckPackets, those should be
# processed.
# Send the AckPacket to the queue for processing elsewhere.
return packet
else:
# Make sure we aren't re-processing the same packet
# For RF based APRS Clients we can get duplicate packets
# So we need to track them and not process the dupes.
pkt_list = packets.PacketList()
found = False
try:
# Find the packet in the list of already seen packets
# Based on the packet.key
found = pkt_list.find(packet)
if not packet.msgNo:
# If the packet doesn't have a message id
# then there is no reliable way to detect
# if it's a dupe, so we just pass it on.
# it shouldn't get acked either.
found = False
except KeyError:
found = False
if not found:
# We haven't seen this packet before, so we process it.
return packet
if not packet.processed:
# We haven't processed this packet through the plugins.
return packet
elif packet.timestamp - found.timestamp < CONF.packet_dupe_timeout:
# If the packet came in within N seconds of the
# Last time seeing the packet, then we drop it as a dupe.
LOG.warning(
f'Packet {packet.from_call}:{packet.msgNo} already tracked, dropping.'
)
else:
LOG.warning(
f'Packet {packet.from_call}:{packet.msgNo} already tracked '
f'but older than {CONF.packet_dupe_timeout} seconds. processing.',
)
return packet
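
A rough sketch (not part of this diff) of how a receive path would consult the filter chain once DupePacketFilter is registered; the handle_rx helper is invented for illustration:

from aprsd.packets.filter import PacketFilter

def handle_rx(packet):
    # Returns None when any registered filter (e.g. the dupe filter) drops it.
    kept = PacketFilter().filter(packet)
    if kept is None:
        return
    # ... hand the packet to the plugins, then mark it processed ...
    kept.processed = True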

View File

@ -0,0 +1,53 @@
import logging
from typing import Union
from oslo_config import cfg
from aprsd import packets
from aprsd.packets import core
from aprsd.utils import singleton
CONF = cfg.CONF
LOG = logging.getLogger('APRSD')
@singleton
class PacketTypeFilter:
"""This filter is used to filter out packets that don't match a specific type.
To use this, register it with the PacketFilter class,
then instantiate it and call set_allow_list() with a list of packet type
names you want to allow through the filter. All other packets will be
filtered out.
"""
filters = {
packets.Packet.__name__: packets.Packet,
packets.AckPacket.__name__: packets.AckPacket,
packets.BeaconPacket.__name__: packets.BeaconPacket,
packets.GPSPacket.__name__: packets.GPSPacket,
packets.MessagePacket.__name__: packets.MessagePacket,
packets.MicEPacket.__name__: packets.MicEPacket,
packets.ObjectPacket.__name__: packets.ObjectPacket,
packets.StatusPacket.__name__: packets.StatusPacket,
packets.ThirdPartyPacket.__name__: packets.ThirdPartyPacket,
packets.WeatherPacket.__name__: packets.WeatherPacket,
packets.UnknownPacket.__name__: packets.UnknownPacket,
}
allow_list = ()
def set_allow_list(self, filter_list):
tmp_list = []
for filter in filter_list:
LOG.warning(
f'Setting filter {filter} : {self.filters[filter]} to tmp {tmp_list}'
)
tmp_list.append(self.filters[filter])
self.allow_list = tuple(tmp_list)
def filter(self, packet: type[core.Packet]) -> Union[type[core.Packet], None]:
"""Only allow packets of certain types to filter through."""
if self.allow_list:
if isinstance(packet, self.allow_list):
return packet
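
A usage sketch, not part of this diff; the import path is an assumption, since the filename for this new module is not shown here:

from aprsd.packets.filter import PacketFilter
# Assumed module path for this new filter:
from aprsd.packets.filters.packet_type import PacketTypeFilter

PacketFilter().register(PacketTypeFilter)
# Only let weather and message packets through for this command.
PacketTypeFilter().set_allow_list(['WeatherPacket', 'MessagePacket'])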

View File

@ -12,13 +12,13 @@ LOG = logging.getLogger()
LOGU = logger
CONF = cfg.CONF
FROM_COLOR = "fg #C70039"
TO_COLOR = "fg #D033FF"
TX_COLOR = "red"
RX_COLOR = "green"
PACKET_COLOR = "cyan"
DISTANCE_COLOR = "fg #FF5733"
DEGREES_COLOR = "fg #FFA900"
FROM_COLOR = 'fg #C70039'
TO_COLOR = 'fg #D033FF'
TX_COLOR = 'red'
RX_COLOR = 'green'
PACKET_COLOR = 'cyan'
DISTANCE_COLOR = 'fg #FF5733'
DEGREES_COLOR = 'fg #FFA900'
def log_multiline(
@ -27,11 +27,11 @@ def log_multiline(
"""LOG a packet to the logfile."""
if not CONF.enable_packet_logging:
return
if CONF.log_packet_format == "compact":
if CONF.log_packet_format == 'compact':
return
# asdict(packet)
logit = ["\n"]
logit = ['\n']
name = packet.__class__.__name__
if isinstance(packet, AckPacket):
@ -41,57 +41,67 @@ def log_multiline(
if header:
if tx:
header_str = f"<{TX_COLOR}>TX</{TX_COLOR}>"
header_str = f'<{TX_COLOR}>TX</{TX_COLOR}>'
logit.append(
f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}> "
f"TX:{packet.send_count + 1} of {pkt_max_send_count}",
f'{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}> '
f'TX:{packet.send_count + 1} of {pkt_max_send_count}',
)
else:
header_str = f"<{RX_COLOR}>RX</{RX_COLOR}>"
header_str = f'<{RX_COLOR}>RX</{RX_COLOR}>'
logit.append(
f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)",
f'{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)',
)
else:
header_str = ""
logit.append(f"__________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)")
header_str = ''
logit.append(f'__________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)')
# log_list.append(f" Packet : {packet.__class__.__name__}")
if packet.msgNo:
logit.append(f" Msg # : {packet.msgNo}")
logit.append(f' Msg # : {packet.msgNo}')
if packet.from_call:
logit.append(f" From : <{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}>")
logit.append(f' From : <{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}>')
if packet.to_call:
logit.append(f" To : <{TO_COLOR}>{packet.to_call}</{TO_COLOR}>")
if hasattr(packet, "path") and packet.path:
logit.append(f" Path : {'=>'.join(packet.path)}")
if hasattr(packet, "via") and packet.via:
logit.append(f" VIA : {packet.via}")
logit.append(f' To : <{TO_COLOR}>{packet.to_call}</{TO_COLOR}>')
if hasattr(packet, 'path') and packet.path:
logit.append(f' Path : {"=>".join(packet.path)}')
if hasattr(packet, 'via') and packet.via:
logit.append(f' VIA : {packet.via}')
if not isinstance(packet, AckPacket) and not isinstance(packet, RejectPacket):
msg = packet.human_info
if msg:
msg = msg.replace("<", "\\<")
logit.append(f" Info : <light-yellow><b>{msg}</b></light-yellow>")
msg = msg.replace('<', '\\<')
logit.append(f' Info : <light-yellow><b>{msg}</b></light-yellow>')
if hasattr(packet, "comment") and packet.comment:
logit.append(f" Comment : {packet.comment}")
if hasattr(packet, 'comment') and packet.comment:
logit.append(f' Comment : {packet.comment}')
raw = packet.raw.replace("<", "\\<")
logit.append(f" Raw : <fg #828282>{raw}</fg #828282>")
logit.append(f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)")
raw = packet.raw.replace('<', '\\<')
logit.append(f' Raw : <fg #828282>{raw}</fg #828282>')
logit.append(f'{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)')
LOGU.opt(colors=True).info("\n".join(logit))
LOGU.opt(colors=True).info('\n'.join(logit))
LOG.debug(repr(packet))
def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> None:
def log(
packet,
tx: Optional[bool] = False,
header: Optional[bool] = True,
packet_count: Optional[int] = None,
) -> None:
if not CONF.enable_packet_logging:
return
if CONF.log_packet_format == "multiline":
if CONF.log_packet_format == 'multiline':
log_multiline(packet, tx, header)
return
if not packet_count:
packet_count = ''
else:
packet_count = f'({packet_count:d})'
logit = []
name = packet.__class__.__name__
if isinstance(packet, AckPacket):
@ -101,47 +111,47 @@ def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> No
if header:
if tx:
via_color = "red"
arrow = f"<{via_color}>\u2192</{via_color}>"
via_color = 'red'
arrow = f'<{via_color}>\u2192</{via_color}>'
logit.append(
f"<red>TX\u2191</red> "
f"<cyan>{name}</cyan>"
f":{packet.msgNo}"
f" ({packet.send_count + 1} of {pkt_max_send_count})",
f'<red>TX{packet_count}\u2191</red> '
f'<cyan>{name}</cyan>'
f':{packet.msgNo}'
f' ({packet.send_count + 1} of {pkt_max_send_count})',
)
else:
via_color = "fg #1AA730"
arrow = f"<{via_color}>\u2192</{via_color}>"
f"<{via_color}><-</{via_color}>"
via_color = 'fg #1AA730'
arrow = f'<{via_color}>\u2192</{via_color}>'
f'<{via_color}><-</{via_color}>'
logit.append(
f"<fg #1AA730>RX\u2193</fg #1AA730> "
f"<cyan>{name}</cyan>"
f":{packet.msgNo}",
f'<fg #1AA730>RX{packet_count}\u2193</fg #1AA730> '
f'<cyan>{name}</cyan>'
f':{packet.msgNo}',
)
else:
via_color = "green"
arrow = f"<{via_color}>-></{via_color}>"
via_color = 'green'
arrow = f'<{via_color}>-></{via_color}>'
logit.append(
f"<cyan>{name}</cyan>" f":{packet.msgNo}",
f'<cyan>{name}</cyan>:{packet.msgNo}',
)
tmp = None
if packet.path:
tmp = f"{arrow}".join(packet.path) + f"{arrow} "
tmp = f'{arrow}'.join(packet.path) + f'{arrow} '
logit.append(
f"<{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}> {arrow}"
f"{tmp if tmp else ' '}"
f"<{TO_COLOR}>{packet.to_call}</{TO_COLOR}>",
f'<{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}> {arrow}'
f'{tmp if tmp else " "}'
f'<{TO_COLOR}>{packet.to_call}</{TO_COLOR}>',
)
if not isinstance(packet, AckPacket) and not isinstance(packet, RejectPacket):
logit.append(":")
logit.append(':')
msg = packet.human_info
if msg:
msg = msg.replace("<", "\\<")
logit.append(f"<light-yellow><b>{msg}</b></light-yellow>")
msg = msg.replace('<', '\\<')
logit.append(f'<light-yellow><b>{msg}</b></light-yellow>')
# is there distance information?
if isinstance(packet, GPSPacket) and CONF.latitude and CONF.longitude:
@ -150,12 +160,12 @@ def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> No
try:
bearing = utils.calculate_initial_compass_bearing(my_coords, packet_coords)
except Exception as e:
LOG.error(f"Failed to calculate bearing: {e}")
LOG.error(f'Failed to calculate bearing: {e}')
bearing = 0
logit.append(
f" : <{DEGREES_COLOR}>{utils.degrees_to_cardinal(bearing, full_string=True)}</{DEGREES_COLOR}>"
f"<{DISTANCE_COLOR}>@{haversine(my_coords, packet_coords, unit=Unit.MILES):.2f}miles</{DISTANCE_COLOR}>",
f' : <{DEGREES_COLOR}>{utils.degrees_to_cardinal(bearing, full_string=True)}</{DEGREES_COLOR}>'
f'<{DISTANCE_COLOR}>@{haversine(my_coords, packet_coords, unit=Unit.MILES):.2f}miles</{DISTANCE_COLOR}>',
)
LOGU.opt(colors=True).info(" ".join(logit))
LOGU.opt(colors=True).info(' '.join(logit))
log_multiline(packet, tx, header)
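
A small sketch, not part of this diff, of the new optional packet_count argument; my_packet and the count are placeholders:

# Single-line format now shows a running counter, e.g. "RX(1234)".
log(my_packet, tx=False, header=True, packet_count=1234)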

View File

@ -7,7 +7,7 @@ from aprsd.packets import core
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
class PacketList(objectstore.ObjectStoreMixin):
@ -27,8 +27,8 @@ class PacketList(objectstore.ObjectStoreMixin):
def _init_data(self):
self.data = {
"types": {},
"packets": OrderedDict(),
'types': {},
'packets': OrderedDict(),
}
def rx(self, packet: type[core.Packet]):
@ -37,11 +37,11 @@ class PacketList(objectstore.ObjectStoreMixin):
self._total_rx += 1
self._add(packet)
ptype = packet.__class__.__name__
type_stats = self.data["types"].setdefault(
type_stats = self.data['types'].setdefault(
ptype,
{"tx": 0, "rx": 0},
{'tx': 0, 'rx': 0},
)
type_stats["rx"] += 1
type_stats['rx'] += 1
def tx(self, packet: type[core.Packet]):
"""Add a packet that was received."""
@ -49,32 +49,32 @@ class PacketList(objectstore.ObjectStoreMixin):
self._total_tx += 1
self._add(packet)
ptype = packet.__class__.__name__
type_stats = self.data["types"].setdefault(
type_stats = self.data['types'].setdefault(
ptype,
{"tx": 0, "rx": 0},
{'tx': 0, 'rx': 0},
)
type_stats["tx"] += 1
type_stats['tx'] += 1
def add(self, packet):
with self.lock:
self._add(packet)
def _add(self, packet):
if not self.data.get("packets"):
if not self.data.get('packets'):
self._init_data()
if packet.key in self.data["packets"]:
self.data["packets"].move_to_end(packet.key)
elif len(self.data["packets"]) == self.maxlen:
self.data["packets"].popitem(last=False)
self.data["packets"][packet.key] = packet
if packet.key in self.data['packets']:
self.data['packets'].move_to_end(packet.key)
elif len(self.data['packets']) == self.maxlen:
self.data['packets'].popitem(last=False)
self.data['packets'][packet.key] = packet
def find(self, packet):
with self.lock:
return self.data["packets"][packet.key]
return self.data['packets'][packet.key]
def __len__(self):
with self.lock:
return len(self.data["packets"])
return len(self.data['packets'])
def total_rx(self):
with self.lock:
@ -87,17 +87,23 @@ class PacketList(objectstore.ObjectStoreMixin):
def stats(self, serializable=False) -> dict:
with self.lock:
# Get last N packets directly using list slicing
packets_list = list(self.data.get("packets", {}).values())
pkts = packets_list[-CONF.packet_list_stats_maxlen :][::-1]
if CONF.packet_list_stats_maxlen >= 0:
packets_list = list(self.data.get('packets', {}).values())
pkts = packets_list[-CONF.packet_list_stats_maxlen :][::-1]
else:
# We have to copy here, because this get() results in a pointer
# to the packets internally here, which can change after this
# function returns, which would cause a problem trying to save
# the stats to disk.
pkts = self.data.get('packets', {}).copy()
stats = {
"total_tracked": self._total_rx
'total_tracked': self._total_rx
+ self._total_tx, # Fixed typo: was rx + rx
"rx": self._total_rx,
"tx": self._total_tx,
"types": self.data.get("types", {}), # Changed default from [] to {}
"packet_count": len(self.data.get("packets", [])),
"maxlen": self.maxlen,
"packets": pkts,
'rx': self._total_rx,
'tx': self._total_tx,
'types': self.data.get('types', {}), # Changed default from [] to {}
'packet_count': len(self.data.get('packets', [])),
'maxlen': self.maxlen,
'packets': pkts,
}
return stats
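
For reference, and not part of this diff, the dict returned by stats() has roughly this shape (numbers invented):

# {
#     'total_tracked': 120,
#     'rx': 80,
#     'tx': 40,
#     'types': {'MessagePacket': {'tx': 10, 'rx': 30}},
#     'packet_count': 100,
#     'maxlen': 100,
#     'packets': [...],  # newest first when packet_list_stats_maxlen >= 0
# }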

View File

@ -12,29 +12,30 @@ import pluggy
from oslo_config import cfg
import aprsd
from aprsd import client, packets, threads
from aprsd import packets, threads
from aprsd.client.client import APRSDClient
from aprsd.packets import watch_list
# setup the global logger
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
CORE_MESSAGE_PLUGINS = [
"aprsd.plugins.email.EmailPlugin",
"aprsd.plugins.fortune.FortunePlugin",
"aprsd.plugins.location.LocationPlugin",
"aprsd.plugins.ping.PingPlugin",
"aprsd.plugins.time.TimePlugin",
"aprsd.plugins.weather.USWeatherPlugin",
"aprsd.plugins.version.VersionPlugin",
'aprsd.plugins.email.EmailPlugin',
'aprsd.plugins.fortune.FortunePlugin',
'aprsd.plugins.location.LocationPlugin',
'aprsd.plugins.ping.PingPlugin',
'aprsd.plugins.time.TimePlugin',
'aprsd.plugins.weather.USWeatherPlugin',
'aprsd.plugins.version.VersionPlugin',
]
CORE_NOTIFY_PLUGINS = [
"aprsd.plugins.notify.NotifySeenPlugin",
'aprsd.plugins.notify.NotifySeenPlugin',
]
hookspec = pluggy.HookspecMarker("aprsd")
hookimpl = pluggy.HookimplMarker("aprsd")
hookspec = pluggy.HookspecMarker('aprsd')
hookimpl = pluggy.HookimplMarker('aprsd')
class APRSDPluginSpec:
@ -76,14 +77,14 @@ class APRSDPluginBase(metaclass=abc.ABCMeta):
else:
LOG.error(
"Can't start thread {}:{}, Must be a child "
"of aprsd.threads.APRSDThread".format(
'of aprsd.threads.APRSDThread'.format(
self,
thread,
),
)
except Exception:
LOG.error(
"Failed to start threads for plugin {}".format(
'Failed to start threads for plugin {}'.format(
self,
),
)
@ -93,7 +94,7 @@ class APRSDPluginBase(metaclass=abc.ABCMeta):
return self.message_counter
def help(self) -> str:
return "Help!"
return 'Help!'
@abc.abstractmethod
def setup(self):
@ -146,11 +147,11 @@ class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
watch_list = CONF.watch_list.callsigns
# make sure the timeout is set or this doesn't work
if watch_list:
aprs_client = client.client_factory.create().client
filter_str = "b/{}".format("/".join(watch_list))
aprs_client = APRSDClient()
filter_str = 'b/{}'.format('/'.join(watch_list))
aprs_client.set_filter(filter_str)
else:
LOG.warning("Watch list enabled, but no callsigns set.")
LOG.warning('Watch list enabled, but no callsigns set.')
@hookimpl
def filter(self, packet: type[packets.Packet]) -> str | packets.MessagePacket:
@ -164,7 +165,7 @@ class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
result = self.process(packet)
except Exception as ex:
LOG.error(
"Plugin {} failed to process packet {}".format(
'Plugin {} failed to process packet {}'.format(
self.__class__,
ex,
),
@ -172,7 +173,7 @@ class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
if result:
self.tx_inc()
else:
LOG.warning(f"{self.__class__} plugin is not enabled")
LOG.warning(f'{self.__class__} plugin is not enabled')
return result
@ -196,7 +197,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
raise NotImplementedError
def help(self):
return "{}: {}".format(
return '{}: {}'.format(
self.command_name.lower(),
self.command_regex,
)
@ -207,7 +208,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
@hookimpl
def filter(self, packet: packets.MessagePacket) -> str | packets.MessagePacket:
LOG.debug(f"{self.__class__.__name__} called")
LOG.debug(f'{self.__class__.__name__} called')
if not self.enabled:
result = f"{self.__class__.__name__} isn't enabled"
LOG.warning(result)
@ -215,7 +216,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
if not isinstance(packet, packets.MessagePacket):
LOG.warning(
f"{self.__class__.__name__} Got a {packet.__class__.__name__} ignoring"
f'{self.__class__.__name__} Got a {packet.__class__.__name__} ignoring'
)
return packets.NULL_MESSAGE
@ -237,7 +238,7 @@ class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
result = self.process(packet)
except Exception as ex:
LOG.error(
"Plugin {} failed to process packet {}".format(
'Plugin {} failed to process packet {}'.format(
self.__class__,
ex,
),
@ -254,7 +255,7 @@ class APRSFIKEYMixin:
def ensure_aprs_fi_key(self):
if not CONF.aprs_fi.apiKey:
LOG.error("Config aprs_fi.apiKey is not set")
LOG.error('Config aprs_fi.apiKey is not set')
self.enabled = False
else:
self.enabled = True
@ -266,25 +267,25 @@ class HelpPlugin(APRSDRegexCommandPluginBase):
This plugin is in this file to prevent a circular import.
"""
command_regex = "^[hH]"
command_name = "help"
command_regex = '^[hH]'
command_name = 'help'
def help(self):
return "Help: send APRS help or help <plugin>"
return 'Help: send APRS help or help <plugin>'
def process(self, packet: packets.MessagePacket):
LOG.info("HelpPlugin")
LOG.info('HelpPlugin')
# fromcall = packet.get("from")
message = packet.message_text
# ack = packet.get("msgNo", "0")
a = re.search(r"^.*\s+(.*)", message)
a = re.search(r'^.*\s+(.*)', message)
command_name = None
if a is not None:
command_name = a.group(1).lower()
pm = PluginManager()
if command_name and "?" not in command_name:
if command_name and '?' not in command_name:
# user wants help for a specific plugin
reply = None
for p in pm.get_plugins():
@ -303,20 +304,20 @@ class HelpPlugin(APRSDRegexCommandPluginBase):
LOG.debug(p)
if p.enabled and isinstance(p, APRSDRegexCommandPluginBase):
name = p.command_name.lower()
if name not in list and "help" not in name:
if name not in list and 'help' not in name:
list.append(name)
list.sort()
reply = " ".join(list)
reply = ' '.join(list)
lines = textwrap.wrap(reply, 60)
replies = ["Send APRS MSG of 'help' or 'help <plugin>'"]
for line in lines:
replies.append(f"plugins: {line}")
replies.append(f'plugins: {line}')
for entry in replies:
LOG.debug(f"{len(entry)} {entry}")
LOG.debug(f'{len(entry)} {entry}')
LOG.debug(f"{replies}")
LOG.debug(f'{replies}')
return replies
@ -341,17 +342,17 @@ class PluginManager:
return cls._instance
def _init(self):
self._pluggy_pm = pluggy.PluginManager("aprsd")
self._pluggy_pm = pluggy.PluginManager('aprsd')
self._pluggy_pm.add_hookspecs(APRSDPluginSpec)
# For the watchlist plugins
self._watchlist_pm = pluggy.PluginManager("aprsd")
self._watchlist_pm = pluggy.PluginManager('aprsd')
self._watchlist_pm.add_hookspecs(APRSDPluginSpec)
def stats(self, serializable=False) -> dict:
"""Collect and return stats for all plugins."""
def full_name_with_qualname(obj):
return "{}.{}".format(
return '{}.{}'.format(
obj.__class__.__module__,
obj.__class__.__qualname__,
)
@ -361,10 +362,10 @@ class PluginManager:
if plugins:
for p in plugins:
plugin_stats[full_name_with_qualname(p)] = {
"enabled": p.enabled,
"rx": p.rx_count,
"tx": p.tx_count,
"version": p.version,
'enabled': p.enabled,
'rx': p.rx_count,
'tx': p.tx_count,
'version': p.version,
}
return plugin_stats
@ -392,19 +393,19 @@ class PluginManager:
module_name = None
class_name = None
try:
module_name, class_name = module_class_string.rsplit(".", 1)
module_name, class_name = module_class_string.rsplit('.', 1)
module = importlib.import_module(module_name)
# Commented out because the email thread starts in a different context
# and hence gives a different singleton for the EmailStats
# module = importlib.reload(module)
except Exception as ex:
if not module_name:
LOG.error(f"Failed to load Plugin {module_class_string}")
LOG.error(f'Failed to load Plugin {module_class_string}')
else:
LOG.error(f"Failed to load Plugin '{module_name}' : '{ex}'")
return
assert hasattr(module, class_name), "class {} is not in {}".format(
assert hasattr(module, class_name), 'class {} is not in {}'.format(
class_name,
module_name,
)
@ -412,7 +413,7 @@ class PluginManager:
# class_name, module_name))
cls = getattr(module, class_name)
if super_cls is not None:
assert issubclass(cls, super_cls), "class {} should inherit from {}".format(
assert issubclass(cls, super_cls), 'class {} should inherit from {}'.format(
class_name,
super_cls.__name__,
)
@ -444,7 +445,7 @@ class PluginManager:
self._watchlist_pm.register(plugin_obj)
else:
LOG.warning(
f"Plugin {plugin_obj.__class__.__name__} is disabled"
f'Plugin {plugin_obj.__class__.__name__} is disabled'
)
elif isinstance(plugin_obj, APRSDRegexCommandPluginBase):
if plugin_obj.enabled:
@ -458,7 +459,7 @@ class PluginManager:
self._pluggy_pm.register(plugin_obj)
else:
LOG.warning(
f"Plugin {plugin_obj.__class__.__name__} is disabled"
f'Plugin {plugin_obj.__class__.__name__} is disabled'
)
elif isinstance(plugin_obj, APRSDPluginBase):
if plugin_obj.enabled:
@ -471,7 +472,7 @@ class PluginManager:
self._pluggy_pm.register(plugin_obj)
else:
LOG.warning(
f"Plugin {plugin_obj.__class__.__name__} is disabled"
f'Plugin {plugin_obj.__class__.__name__} is disabled'
)
except Exception as ex:
LOG.error(f"Couldn't load plugin '{plugin_name}'")
@ -485,11 +486,11 @@ class PluginManager:
def setup_plugins(
self,
load_help_plugin=True,
plugin_list=[],
plugin_list=None,
):
"""Create the plugin manager and register plugins."""
LOG.info("Loading APRSD Plugins")
LOG.info('Loading APRSD Plugins')
# Help plugin is always enabled.
if load_help_plugin:
_help = HelpPlugin()
@ -509,7 +510,7 @@ class PluginManager:
for p_name in CORE_MESSAGE_PLUGINS:
self._load_plugin(p_name)
LOG.info("Completed Plugin Loading.")
LOG.info('Completed Plugin Loading.')
def run(self, packet: packets.MessagePacket):
"""Execute all the plugins run method."""
@ -524,7 +525,7 @@ class PluginManager:
"""Stop all threads created by all plugins."""
with self.lock:
for p in self.get_plugins():
if hasattr(p, "stop_threads"):
if hasattr(p, 'stop_threads'):
p.stop_threads()
def register_msg(self, obj):

View File

@ -4,21 +4,20 @@ import logging
import requests
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
def get_aprs_fi(api_key, callsign):
LOG.debug(f"Fetch aprs.fi location for '{callsign}'")
try:
url = (
"http://api.aprs.fi/api/get?"
"&what=loc&apikey={}&format=json"
"&name={}".format(api_key, callsign)
'http://api.aprs.fi/api/get?&what=loc&apikey={}&format=json&name={}'.format(
api_key, callsign
)
)
response = requests.get(url)
except Exception:
raise Exception("Failed to get aprs.fi location")
except Exception as e:
raise Exception('Failed to get aprs.fi location') from e
else:
response.raise_for_status()
return json.loads(response.text)
@ -26,22 +25,22 @@ def get_aprs_fi(api_key, callsign):
def get_weather_gov_for_gps(lat, lon):
# FIXME(hemna) This is currently BROKEN
LOG.debug(f"Fetch station at {lat}, {lon}")
LOG.debug(f'Fetch station at {lat}, {lon}')
headers = requests.utils.default_headers()
headers.update(
{"User-Agent": "(aprsd, waboring@hemna.com)"},
{'User-Agent': '(aprsd, waboring@hemna.com)'},
)
try:
url2 = (
"https://forecast.weather.gov/MapClick.php?lat=%s"
"&lon=%s&FcstType=json" % (lat, lon)
'https://forecast.weather.gov/MapClick.php?lat=%s'
'&lon=%s&FcstType=json' % (lat, lon)
# f"https://api.weather.gov/points/{lat},{lon}"
)
LOG.debug(f"Fetching weather '{url2}'")
response = requests.get(url2, headers=headers)
except Exception as e:
LOG.error(e)
raise Exception("Failed to get weather")
raise Exception('Failed to get weather') from e
else:
response.raise_for_status()
return json.loads(response.text)
@ -50,24 +49,24 @@ def get_weather_gov_for_gps(lat, lon):
def get_weather_gov_metar(station):
LOG.debug(f"Fetch metar for station '{station}'")
try:
url = "https://api.weather.gov/stations/{}/observations/latest".format(
url = 'https://api.weather.gov/stations/{}/observations/latest'.format(
station,
)
response = requests.get(url)
except Exception:
raise Exception("Failed to fetch metar")
except Exception as e:
raise Exception('Failed to fetch metar') from e
else:
response.raise_for_status()
return json.loads(response)
def fetch_openweathermap(api_key, lat, lon, units="metric", exclude=None):
LOG.debug(f"Fetch openweathermap for {lat}, {lon}")
def fetch_openweathermap(api_key, lat, lon, units='metric', exclude=None):
LOG.debug(f'Fetch openweathermap for {lat}, {lon}')
if not exclude:
exclude = "minutely,hourly,daily,alerts"
exclude = 'minutely,hourly,daily,alerts'
try:
url = (
"https://api.openweathermap.org/data/2.5/onecall?"
"https://api.openweathermap.org/data/3.0/onecall?"
"lat={}&lon={}&appid={}&units={}&exclude={}".format(
lat,
lon,
@ -80,7 +79,7 @@ def fetch_openweathermap(api_key, lat, lon, units="metric", exclude=None):
response = requests.get(url)
except Exception as e:
LOG.error(e)
raise Exception("Failed to get weather")
raise Exception('Failed to get weather') from e
else:
response.raise_for_status()
return json.loads(response.text)
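
A hedged usage sketch, not part of this diff, of the helper now pointed at the One Call 3.0 endpoint; the API key and coordinates are placeholders:

wx = fetch_openweathermap('YOUR_OWM_API_KEY', 37.77, -122.42, units='imperial')
print(wx['current']['temp'], wx['current']['humidity'])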

View File

@ -9,7 +9,7 @@ from aprsd import plugin, plugin_utils
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
@ -26,22 +26,22 @@ class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin)
"""
# command_regex = r"^([w][x]|[w][x]\s|weather)"
command_regex = r"^[wW]"
command_regex = r'^[wW]'
command_name = "USWeather"
short_description = "Provide USA only weather of GPS Beacon location"
command_name = 'USWeather'
short_description = 'Provide USA only weather of GPS Beacon location'
def setup(self):
self.ensure_aprs_fi_key()
@trace.trace
def process(self, packet):
LOG.info("Weather Plugin")
LOG.info('Weather Plugin')
fromcall = packet.from_call
message = packet.get("message_text", None)
message = packet.get('message_text', None)
# message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
a = re.search(r"^.*\s+(.*)", message)
a = re.search(r'^.*\s+(.*)', message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
@ -51,34 +51,34 @@ class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin)
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch aprs.fi location"
LOG.error(f'Failed to fetch aprs.fi data {ex}')
return 'Failed to fetch aprs.fi location'
LOG.debug(f"LocationPlugin: aprs_data = {aprs_data}")
if not len(aprs_data["entries"]):
LOG.debug(f'LocationPlugin: aprs_data = {aprs_data}')
if not len(aprs_data['entries']):
LOG.error("Didn't get any entries from aprs.fi")
return "Failed to fetch aprs.fi location"
return 'Failed to fetch aprs.fi location'
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
lat = aprs_data['entries'][0]['lat']
lon = aprs_data['entries'][0]['lng']
try:
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
except Exception as ex:
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
return "Unable to get weather"
return 'Unable to get weather'
LOG.info(f"WX data {wx_data}")
LOG.info(f'WX data {wx_data}')
reply = (
"%sF(%sF/%sF) %s. %s, %s."
'%sF(%sF/%sF) %s. %s, %s.'
% (
wx_data["currentobservation"]["Temp"],
wx_data["data"]["temperature"][0],
wx_data["data"]["temperature"][1],
wx_data["data"]["weather"][0],
wx_data["time"]["startPeriodName"][1],
wx_data["data"]["weather"][1],
wx_data['currentobservation']['Temp'],
wx_data['data']['temperature'][0],
wx_data['data']['temperature'][1],
wx_data['data']['weather'][0],
wx_data['time']['startPeriodName'][1],
wx_data['data']['weather'][1],
)
).rstrip()
LOG.debug(f"reply: '{reply}' ")
@ -100,31 +100,31 @@ class USMetarPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
"""
command_regex = r"^([m]|[M]|[m]\s|metar)"
command_name = "USMetar"
short_description = "USA only METAR of GPS Beacon location"
command_regex = r'^([m]|[M]|[m]\s|metar)'
command_name = 'USMetar'
short_description = 'USA only METAR of GPS Beacon location'
def setup(self):
self.ensure_aprs_fi_key()
@trace.trace
def process(self, packet):
fromcall = packet.get("from")
message = packet.get("message_text", None)
fromcall = packet.get('from')
message = packet.get('message_text', None)
# ack = packet.get("msgNo", "0")
LOG.info(f"WX Plugin '{message}'")
a = re.search(r"^.*\s+(.*)", message)
a = re.search(r'^.*\s+(.*)', message)
if a is not None:
searchcall = a.group(1)
station = searchcall.upper()
try:
resp = plugin_utils.get_weather_gov_metar(station)
except Exception as e:
LOG.debug(f"Weather failed with: {str(e)}")
reply = "Unable to find station METAR"
LOG.debug(f'Weather failed with: {str(e)}')
reply = 'Unable to find station METAR'
else:
station_data = json.loads(resp.text)
reply = station_data["properties"]["rawMessage"]
reply = station_data['properties']['rawMessage']
return reply
else:
@ -136,36 +136,36 @@ class USMetarPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, fromcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch aprs.fi location"
LOG.error(f'Failed to fetch aprs.fi data {ex}')
return 'Failed to fetch aprs.fi location'
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
if not len(aprs_data["entries"]):
LOG.error("Found no entries from aprs.fi!")
return "Failed to fetch aprs.fi location"
if not len(aprs_data['entries']):
LOG.error('Found no entries from aprs.fi!')
return 'Failed to fetch aprs.fi location'
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
lat = aprs_data['entries'][0]['lat']
lon = aprs_data['entries'][0]['lng']
try:
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
except Exception as ex:
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
return "Unable to metar find station."
return 'Unable to metar find station.'
if wx_data["location"]["metar"]:
station = wx_data["location"]["metar"]
if wx_data['location']['metar']:
station = wx_data['location']['metar']
try:
resp = plugin_utils.get_weather_gov_metar(station)
except Exception as e:
LOG.debug(f"Weather failed with: {str(e)}")
reply = "Failed to get Metar"
LOG.debug(f'Weather failed with: {str(e)}')
reply = 'Failed to get Metar'
else:
station_data = json.loads(resp.text)
reply = station_data["properties"]["rawMessage"]
reply = station_data['properties']['rawMessage']
else:
# Couldn't find a station
reply = "No Metar station found"
reply = 'No Metar station found'
return reply
@ -190,35 +190,36 @@ class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
"""
# command_regex = r"^([w][x]|[w][x]\s|weather)"
command_regex = r"^[wW]"
command_regex = r'^[wW]'
command_name = "OpenWeatherMap"
short_description = "OpenWeatherMap weather of GPS Beacon location"
command_name = 'OpenWeatherMap'
short_description = 'OpenWeatherMap weather of GPS Beacon location'
def setup(self):
if not CONF.owm_weather_plugin.apiKey:
LOG.error("Config.owm_weather_plugin.apiKey is not set. Disabling")
LOG.error('Config.owm_weather_plugin.apiKey is not set. Disabling')
self.enabled = False
else:
self.enabled = True
def help(self):
_help = [
"openweathermap: Send {} to get weather " "from your location".format(
'openweathermap: Send {} to get weather from your location'.format(
self.command_regex
),
'openweathermap: Send {} <callsign> to get weather from <callsign>'.format(
self.command_regex
),
"openweathermap: Send {} <callsign> to get "
"weather from <callsign>".format(self.command_regex),
]
return _help
@trace.trace
def process(self, packet):
fromcall = packet.get("from_call")
message = packet.get("message_text", None)
fromcall = packet.get('from_call')
message = packet.get('message_text', None)
# ack = packet.get("msgNo", "0")
LOG.info(f"OWMWeather Plugin '{message}'")
a = re.search(r"^.*\s+(.*)", message)
a = re.search(r'^.*\s+(.*)', message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
@ -230,16 +231,16 @@ class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch location"
LOG.error(f'Failed to fetch aprs.fi data {ex}')
return 'Failed to fetch location'
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
if not len(aprs_data["entries"]):
LOG.error("Found no entries from aprs.fi!")
return "Failed to fetch location"
if not len(aprs_data['entries']):
LOG.error('Found no entries from aprs.fi!')
return 'Failed to fetch location'
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
lat = aprs_data['entries'][0]['lat']
lon = aprs_data['entries'][0]['lng']
units = CONF.units
api_key = CONF.owm_weather_plugin.apiKey
@ -249,40 +250,40 @@ class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
lat,
lon,
units=units,
exclude="minutely,hourly",
exclude='minutely,hourly',
)
except Exception as ex:
LOG.error(f"Couldn't fetch openweathermap api '{ex}'")
# default to UTC
return "Unable to get weather"
return 'Unable to get weather'
if units == "metric":
degree = "C"
if units == 'metric':
degree = 'C'
else:
degree = "F"
degree = 'F'
if "wind_gust" in wx_data["current"]:
wind = "{:.0f}@{}G{:.0f}".format(
wx_data["current"]["wind_speed"],
wx_data["current"]["wind_deg"],
wx_data["current"]["wind_gust"],
if 'wind_gust' in wx_data['current']:
wind = '{:.0f}@{}G{:.0f}'.format(
wx_data['current']['wind_speed'],
wx_data['current']['wind_deg'],
wx_data['current']['wind_gust'],
)
else:
wind = "{:.0f}@{}".format(
wx_data["current"]["wind_speed"],
wx_data["current"]["wind_deg"],
wind = '{:.0f}@{}'.format(
wx_data['current']['wind_speed'],
wx_data['current']['wind_deg'],
)
# LOG.debug(wx_data["current"])
# LOG.debug(wx_data["daily"])
reply = "{} {:.1f}{}/{:.1f}{} Wind {} {}%".format(
wx_data["current"]["weather"][0]["description"],
wx_data["current"]["temp"],
reply = '{} {:.1f}{}/{:.1f}{} Wind {} {}%'.format(
wx_data['current']['weather'][0]['description'],
wx_data['current']['temp'],
degree,
wx_data["current"]["dew_point"],
wx_data['current']['dew_point'],
degree,
wind,
wx_data["current"]["humidity"],
wx_data['current']['humidity'],
)
return reply
@ -311,26 +312,26 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
docker build -f Dockerfile -t avwx-api:master .
"""
command_regex = r"^([m]|[m]|[m]\s|metar)"
command_name = "AVWXWeather"
short_description = "AVWX weather of GPS Beacon location"
command_regex = r'^([m]|[m]|[m]\s|metar)'
command_name = 'AVWXWeather'
short_description = 'AVWX weather of GPS Beacon location'
def setup(self):
if not CONF.avwx_plugin.base_url:
LOG.error("Config avwx_plugin.base_url not specified. Disabling")
LOG.error('Config avwx_plugin.base_url not specified. Disabling')
return False
elif not CONF.avwx_plugin.apiKey:
LOG.error("Config avwx_plugin.apiKey not specified. Disabling")
LOG.error('Config avwx_plugin.apiKey not specified. Disabling')
return False
else:
return True
self.enabled = True
def help(self):
_help = [
"avwxweather: Send {} to get weather " "from your location".format(
'avwxweather: Send {} to get weather from your location'.format(
self.command_regex
),
"avwxweather: Send {} <callsign> to get " "weather from <callsign>".format(
'avwxweather: Send {} <callsign> to get weather from <callsign>'.format(
self.command_regex
),
]
@ -338,11 +339,11 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
@trace.trace
def process(self, packet):
fromcall = packet.get("from")
message = packet.get("message_text", None)
fromcall = packet.get('from')
message = packet.get('message_text', None)
# ack = packet.get("msgNo", "0")
LOG.info(f"AVWXWeather Plugin '{message}'")
a = re.search(r"^.*\s+(.*)", message)
a = re.search(r'^.*\s+(.*)', message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
@ -353,43 +354,43 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch location"
LOG.error(f'Failed to fetch aprs.fi data {ex}')
return 'Failed to fetch location'
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
if not len(aprs_data["entries"]):
LOG.error("Found no entries from aprs.fi!")
return "Failed to fetch location"
if not len(aprs_data['entries']):
LOG.error('Found no entries from aprs.fi!')
return 'Failed to fetch location'
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
lat = aprs_data['entries'][0]['lat']
lon = aprs_data['entries'][0]['lng']
api_key = CONF.avwx_plugin.apiKey
base_url = CONF.avwx_plugin.base_url
token = f"TOKEN {api_key}"
headers = {"Authorization": token}
token = f'TOKEN {api_key}'
headers = {'Authorization': token}
try:
coord = f"{lat},{lon}"
coord = f'{lat},{lon}'
url = (
"{}/api/station/near/{}?"
"n=1&airport=false&reporting=true&format=json".format(base_url, coord)
'{}/api/station/near/{}?'
'n=1&airport=false&reporting=true&format=json'.format(base_url, coord)
)
LOG.debug(f"Get stations near me '{url}'")
response = requests.get(url, headers=headers)
except Exception as ex:
LOG.error(ex)
raise Exception(f"Failed to get the weather '{ex}'")
raise Exception(f"Failed to get the weather '{ex}'") from ex
else:
wx_data = json.loads(response.text)
# LOG.debug(wx_data)
station = wx_data[0]["station"]["icao"]
station = wx_data[0]['station']['icao']
try:
url = (
"{}/api/metar/{}?options=info,translate,summary"
"&airport=true&reporting=true&format=json&onfail=cache".format(
'{}/api/metar/{}?options=info,translate,summary'
'&airport=true&reporting=true&format=json&onfail=cache'.format(
base_url,
station,
)
@ -399,9 +400,9 @@ class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
response = requests.get(url, headers=headers)
except Exception as ex:
LOG.error(ex)
raise Exception(f"Failed to get metar {ex}")
raise Exception(f'Failed to get metar {ex}') from ex
else:
metar_data = json.loads(response.text)
# LOG.debug(metar_data)
return metar_data["raw"]
return metar_data['raw']


@ -3,7 +3,7 @@ from typing import Callable, Protocol, runtime_checkable
from aprsd.utils import singleton
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
@runtime_checkable
@ -31,15 +31,16 @@ class Collector:
serializable=serializable
).copy()
except Exception as e:
LOG.error(f"Error in producer {name} (stats): {e}")
LOG.error(f'Error in producer {name} (stats): {e}')
raise e
return stats
def register_producer(self, producer_name: Callable):
if not isinstance(producer_name, StatsProducer):
raise TypeError(f"Producer {producer_name} is not a StatsProducer")
raise TypeError(f'Producer {producer_name} is not a StatsProducer')
self.producers.append(producer_name)
def unregister_producer(self, producer_name: Callable):
if not isinstance(producer_name, StatsProducer):
raise TypeError(f"Producer {producer_name} is not a StatsProducer")
raise TypeError(f'Producer {producer_name} is not a StatsProducer')
self.producers.remove(producer_name)
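As a rough illustration of how this registry is meant to be used (the producer class below is made up; the only assumption is that a StatsProducer exposes a stats(serializable=False) method returning a dict, which matches the collect() call above):

from aprsd.stats import collector

class MyStatsProducer:
    # Satisfies the runtime-checkable StatsProducer protocol by providing stats().
    def stats(self, serializable=False) -> dict:
        return {'my_counter': 42}

# register_producer() is handed the class itself; collect() appears to
# instantiate each registered producer when it gathers stats.
collector.Collector().register_producer(MyStatsProducer)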


@ -4,9 +4,8 @@ import queue
# aprsd.threads
from .aprsd import APRSDThread, APRSDThreadList # noqa: F401
from .rx import ( # noqa: F401
APRSDDupeRXThread,
APRSDProcessPacketThread,
APRSDRXThread,
)
packet_queue = queue.Queue(maxsize=20)
packet_queue = queue.Queue(maxsize=500)


@ -7,36 +7,50 @@ import aprslib
from oslo_config import cfg
from aprsd import packets, plugin
from aprsd.client import client_factory
from aprsd.packets import collector
from aprsd.client.client import APRSDClient
from aprsd.packets import collector, filter
from aprsd.packets import log as packet_log
from aprsd.threads import APRSDThread, tx
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
class APRSDRXThread(APRSDThread):
"""Main Class to connect to an APRS Client and recieve packets.
A packet is received in the main loop and then sent to the
process_packet method, which sends the packet through the collector
to track the packet for stats, and then put into the packet queue
for processing in a separate thread.
"""
_client = None
# This is the queue that packets are sent to for processing.
# We process packets in a separate thread to help prevent
# getting blocked by the APRS server trying to send us packets.
packet_queue = None
pkt_count = 0
def __init__(self, packet_queue):
super().__init__("RX_PKT")
super().__init__('RX_PKT')
self.packet_queue = packet_queue
def stop(self):
self.thread_stop = True
if self._client:
self._client.stop()
self._client.close()
def loop(self):
if not self._client:
self._client = client_factory.create()
self._client = APRSDClient()
time.sleep(1)
return True
if not self._client.is_connected:
self._client = client_factory.create()
if not self._client.is_alive:
self._client = APRSDClient()
time.sleep(1)
return True
@ -52,62 +66,35 @@ class APRSDRXThread(APRSDThread):
# kwargs. :(
# https://github.com/rossengeorgiev/aprs-python/pull/56
self._client.consumer(
self._process_packet,
self.process_packet,
raw=False,
blocking=False,
)
except (
aprslib.exceptions.ConnectionDrop,
aprslib.exceptions.ConnectionError,
):
LOG.error("Connection dropped, reconnecting")
LOG.error('Connection dropped, reconnecting')
# Force the deletion of the client object connected to aprs
# This will cause a reconnect, next time client.get_client()
# is called
self._client.reset()
time.sleep(5)
except Exception:
# LOG.exception(ex)
LOG.error("Resetting connection and trying again.")
except Exception as ex:
LOG.exception(ex)
LOG.error('Resetting connection and trying again.')
self._client.reset()
time.sleep(5)
# Continue to loop
time.sleep(1)
return True
def _process_packet(self, *args, **kwargs):
"""Intermediate callback so we can update the keepalive time."""
# Now call the 'real' packet processing for a RX'x packet
self.process_packet(*args, **kwargs)
@abc.abstractmethod
def process_packet(self, *args, **kwargs):
pass
class APRSDDupeRXThread(APRSDRXThread):
"""Process received packets.
This is the main APRSD Server command thread that
receives packets and makes sure the packet
hasn't been seen previously before sending it on
to be processed.
"""
@trace.trace
def process_packet(self, *args, **kwargs):
"""This handles the processing of an inbound packet.
When a packet is received by the connected client object,
it sends the raw packet into this function. This function then
decodes the packet via the client, and then processes the packet.
Ack Packets are sent to the PluginProcessPacketThread for processing.
All other packets have to be checked as a dupe, and then only after
we haven't seen this packet before, do we send it to the
PluginProcessPacketThread for processing.
"""
packet = self._client.decode_packet(*args, **kwargs)
packet_log.log(packet)
if not packet:
LOG.error(
'No packet received from decode_packet. Most likely a failure to parse'
)
return
self.pkt_count += 1
packet_log.log(packet, packet_count=self.pkt_count)
pkt_list = packets.PacketList()
if isinstance(packet, packets.AckPacket):
@ -140,26 +127,55 @@ class APRSDDupeRXThread(APRSDRXThread):
# If the packet came in within N seconds of the
# Last time seeing the packet, then we drop it as a dupe.
LOG.warning(
f"Packet {packet.from_call}:{packet.msgNo} already tracked, dropping."
f'Packet {packet.from_call}:{packet.msgNo} already tracked, dropping.'
)
else:
LOG.warning(
f"Packet {packet.from_call}:{packet.msgNo} already tracked "
f"but older than {CONF.packet_dupe_timeout} seconds. processing.",
f'Packet {packet.from_call}:{packet.msgNo} already tracked '
f'but older than {CONF.packet_dupe_timeout} seconds. processing.',
)
collector.PacketCollector().rx(packet)
self.packet_queue.put(packet)
class APRSDPluginRXThread(APRSDDupeRXThread):
""" "Process received packets.
class APRSDFilterThread(APRSDThread):
def __init__(self, thread_name, packet_queue):
super().__init__(thread_name)
self.packet_queue = packet_queue
For backwards compatibility, we keep the APRSDPluginRXThread.
"""
def filter_packet(self, packet):
# Do any packet filtering prior to processing
if not filter.PacketFilter().filter(packet):
return None
return packet
def print_packet(self, packet):
"""Allow a child of this class to override this.
This is helpful if for whatever reason the child class
doesn't want to log packets.
"""
packet_log.log(packet)
def loop(self):
try:
packet = self.packet_queue.get(timeout=1)
self.print_packet(packet)
if packet:
if self.filter_packet(packet):
self.process_packet(packet)
except queue.Empty:
pass
return True
class APRSDProcessPacketThread(APRSDThread):
"""Base class for processing received packets.
class APRSDProcessPacketThread(APRSDFilterThread):
"""Base class for processing received packets after they have been filtered.
Packets are received from the client, then filtered for dupes,
then sent to the packet queue. This thread pulls packets from
the packet queue for processing.
This is the base class for processing packets coming from
the consumer. This base class handles sending ack packets and
@ -167,48 +183,42 @@ class APRSDProcessPacketThread(APRSDThread):
for processing."""
def __init__(self, packet_queue):
self.packet_queue = packet_queue
super().__init__("ProcessPKT")
super().__init__('ProcessPKT', packet_queue=packet_queue)
if not CONF.enable_sending_ack_packets:
LOG.warning(
"Sending ack packets is disabled, messages "
"will not be acknowledged.",
'Sending ack packets is disabled, messages will not be acknowledged.',
)
def process_ack_packet(self, packet):
"""We got an ack for a message, no need to resend it."""
ack_num = packet.msgNo
LOG.debug(f"Got ack for message {ack_num}")
LOG.debug(f'Got ack for message {ack_num}')
collector.PacketCollector().rx(packet)
def process_piggyback_ack(self, packet):
"""We got an ack embedded in a packet."""
ack_num = packet.ackMsgNo
LOG.debug(f"Got PiggyBackAck for message {ack_num}")
LOG.debug(f'Got PiggyBackAck for message {ack_num}')
collector.PacketCollector().rx(packet)
def process_reject_packet(self, packet):
"""We got a reject message for a packet. Stop sending the message."""
ack_num = packet.msgNo
LOG.debug(f"Got REJECT for message {ack_num}")
LOG.debug(f'Got REJECT for message {ack_num}')
collector.PacketCollector().rx(packet)
def loop(self):
try:
packet = self.packet_queue.get(timeout=1)
if packet:
self.process_packet(packet)
except queue.Empty:
pass
return True
def process_packet(self, packet):
"""Process a packet received from aprs-is server."""
LOG.debug(f"ProcessPKT-LOOP {self.loop_count}")
LOG.debug(f'ProcessPKT-LOOP {self.loop_count}')
# set this now as we are going to process it.
# This is used during dupe checking, so set it early
packet.processed = True
our_call = CONF.callsign.lower()
from_call = packet.from_call
if packet.addresse:
if hasattr(packet, 'addresse') and packet.addresse:
to_call = packet.addresse
else:
to_call = packet.to_call
@ -227,7 +237,7 @@ class APRSDProcessPacketThread(APRSDThread):
):
self.process_reject_packet(packet)
else:
if hasattr(packet, "ackMsgNo") and packet.ackMsgNo:
if hasattr(packet, 'ackMsgNo') and packet.ackMsgNo:
# we got an ack embedded in this packet
# we need to handle the ack
self.process_piggyback_ack(packet)
@ -267,7 +277,7 @@ class APRSDProcessPacketThread(APRSDThread):
if not for_us:
LOG.info("Got a packet meant for someone else '{packet.to_call}'")
else:
LOG.info("Got a non AckPacket/MessagePacket")
LOG.info('Got a non AckPacket/MessagePacket')
class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
@ -287,7 +297,7 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
tx.send(subreply)
else:
wl = CONF.watch_list
to_call = wl["alert_callsign"]
to_call = wl['alert_callsign']
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
@ -299,7 +309,7 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
# We have a message based object.
tx.send(reply)
except Exception as ex:
LOG.error("Plugin failed!!!")
LOG.error('Plugin failed!!!')
LOG.exception(ex)
def process_our_message_packet(self, packet):
@ -355,11 +365,11 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
if to_call == CONF.callsign and not replied:
# Tailor the messages accordingly
if CONF.load_help_plugin:
LOG.warning("Sending help!")
LOG.warning('Sending help!')
message_text = "Unknown command! Send 'help' message for help"
else:
LOG.warning("Unknown command!")
message_text = "Unknown command!"
LOG.warning('Unknown command!')
message_text = 'Unknown command!'
tx.send(
packets.MessagePacket(
@ -369,11 +379,11 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
),
)
except Exception as ex:
LOG.error("Plugin failed!!!")
LOG.error('Plugin failed!!!')
LOG.exception(ex)
# Do we need to send a reply?
if to_call == CONF.callsign:
reply = "A Plugin failed! try again?"
reply = 'A Plugin failed! try again?'
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
@ -382,4 +392,4 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
),
)
LOG.debug("Completed process_our_message_packet")
LOG.debug('Completed process_our_message_packet')

aprsd/threads/service.py Normal file

@ -0,0 +1,42 @@
# aprsd/aprsd/threads/service.py
#
# This module is used to register threads that the service command runs.
#
# The service command is used to start and stop the APRS service.
# This is a mechanism to register threads that the service or command
# needs to run, and then start stop them as needed.
from aprsd.threads import aprsd as aprsd_threads
from aprsd.utils import singleton
@singleton
class ServiceThreads:
"""Registry for threads that the service command runs.
This enables extensions to register a thread to run during
the service command.
"""
def __init__(self):
self.threads: list[aprsd_threads.APRSDThread] = []
def register(self, thread: aprsd_threads.APRSDThread):
if not isinstance(thread, aprsd_threads.APRSDThread):
raise TypeError(f'Thread {thread} is not an APRSDThread')
self.threads.append(thread)
def unregister(self, thread: aprsd_threads.APRSDThread):
if not isinstance(thread, aprsd_threads.APRSDThread):
raise TypeError(f'Thread {thread} is not an APRSDThread')
self.threads.remove(thread)
def start(self):
"""Start all threads in the list."""
for thread in self.threads:
thread.start()
def join(self):
"""Join all the threads in the list"""
for thread in self.threads:
thread.join()
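A minimal sketch of how an extension could plug into this registry (the thread class here is hypothetical; the APRSDThread subclass pattern follows the other threads in this changeset):

from aprsd.threads import aprsd as aprsd_threads
from aprsd.threads import service

class MyExtensionThread(aprsd_threads.APRSDThread):
    def __init__(self):
        super().__init__('MyExtension')

    def loop(self):
        # Do one unit of work; returning True keeps the thread looping.
        return True

# Register before the service command starts its threads; start() and join()
# are then driven by the service command itself.
service.ServiceThreads().register(MyExtensionThread())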


@ -1,8 +1,6 @@
import logging
import threading
import time
import wrapt
from oslo_config import cfg
from aprsd.stats import collector
@ -10,18 +8,15 @@ from aprsd.threads import APRSDThread
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
class StatsStore(objectstore.ObjectStoreMixin):
"""Container to save the stats from the collector."""
lock = threading.Lock()
data = {}
@wrapt.synchronized(lock)
def add(self, stats: dict):
self.data = stats
with self.lock:
self.data = stats
class APRSDStatsStoreThread(APRSDThread):
@ -31,7 +26,7 @@ class APRSDStatsStoreThread(APRSDThread):
save_interval = 10
def __init__(self):
super().__init__("StatsStore")
super().__init__('StatsStore')
def loop(self):
if self.loop_count % self.save_interval == 0:


@ -11,12 +11,12 @@ from rush.stores import dictionary
from aprsd import conf # noqa
from aprsd import threads as aprsd_threads
from aprsd.client import client_factory
from aprsd.client.client import APRSDClient
from aprsd.packets import collector, core, tracker
from aprsd.packets import log as packet_log
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
msg_t = throttle.Throttle(
limiter=periodic.PeriodicLimiter(
@ -54,7 +54,7 @@ def send(packet: core.Packet, direct=False, aprs_client=None):
if CONF.enable_sending_ack_packets:
_send_ack(packet, direct=direct, aprs_client=aprs_client)
else:
LOG.info("Sending ack packets is disabled. Not sending AckPacket.")
LOG.info('Sending ack packets is disabled. Not sending AckPacket.')
else:
_send_packet(packet, direct=direct, aprs_client=aprs_client)
@ -81,14 +81,14 @@ def _send_direct(packet, aprs_client=None):
if aprs_client:
cl = aprs_client
else:
cl = client_factory.create()
cl = APRSDClient()
packet.update_timestamp()
packet_log.log(packet, tx=True)
try:
cl.send(packet)
except Exception as e:
LOG.error(f"Failed to send packet: {packet}")
LOG.error(f'Failed to send packet: {packet}')
LOG.error(e)
return False
else:
@ -100,7 +100,7 @@ class SendPacketThread(aprsd_threads.APRSDThread):
def __init__(self, packet):
self.packet = packet
super().__init__(f"TX-{packet.to_call}-{self.packet.msgNo}")
super().__init__(f'TX-{packet.to_call}-{self.packet.msgNo}')
def loop(self):
"""Loop until a message is acked or it gets delayed.
@ -119,9 +119,9 @@ class SendPacketThread(aprsd_threads.APRSDThread):
# The message has been removed from the tracking queue
# So it got acked and we are done.
LOG.info(
f"{self.packet.__class__.__name__}"
f"({self.packet.msgNo}) "
"Message Send Complete via Ack.",
f'{self.packet.__class__.__name__}'
f'({self.packet.msgNo}) '
'Message Send Complete via Ack.',
)
return False
else:
@ -130,10 +130,10 @@ class SendPacketThread(aprsd_threads.APRSDThread):
# we reached the send limit, don't send again
# TODO(hemna) - Need to put this in a delayed queue?
LOG.info(
f"{packet.__class__.__name__} "
f"({packet.msgNo}) "
"Message Send Complete. Max attempts reached"
f" {packet.retry_count}",
f'{packet.__class__.__name__} '
f'({packet.msgNo}) '
'Message Send Complete. Max attempts reached'
f' {packet.retry_count}',
)
pkt_tracker.remove(packet.msgNo)
return False
@ -157,8 +157,9 @@ class SendPacketThread(aprsd_threads.APRSDThread):
sent = False
try:
sent = _send_direct(packet)
except Exception:
LOG.error(f"Failed to send packet: {packet}")
except Exception as ex:
LOG.error(f'Failed to send packet: {packet}')
LOG.error(ex)
else:
# If an exception happens while sending
# we don't want this attempt to count
@ -178,7 +179,7 @@ class SendAckThread(aprsd_threads.APRSDThread):
def __init__(self, packet):
self.packet = packet
super().__init__(f"TXAck-{packet.to_call}-{self.packet.msgNo}")
super().__init__(f'TXAck-{packet.to_call}-{self.packet.msgNo}')
self.max_retries = CONF.default_ack_send_count
def loop(self):
@ -188,10 +189,10 @@ class SendAckThread(aprsd_threads.APRSDThread):
# we reached the send limit, don't send again
# TODO(hemna) - Need to put this in a delayed queue?
LOG.debug(
f"{self.packet.__class__.__name__}"
f"({self.packet.msgNo}) "
"Send Complete. Max attempts reached"
f" {self.max_retries}",
f'{self.packet.__class__.__name__}'
f'({self.packet.msgNo}) '
'Send Complete. Max attempts reached'
f' {self.max_retries}',
)
return False
@ -207,7 +208,7 @@ class SendAckThread(aprsd_threads.APRSDThread):
# It's time to try to send it again
send_now = True
elif self.loop_count % 10 == 0:
LOG.debug(f"Still wating. {delta}")
LOG.debug(f'Still wating. {delta}')
else:
send_now = True
@ -216,7 +217,7 @@ class SendAckThread(aprsd_threads.APRSDThread):
try:
sent = _send_direct(self.packet)
except Exception:
LOG.error(f"Failed to send packet: {self.packet}")
LOG.error(f'Failed to send packet: {self.packet}')
else:
# If an exception happens while sending
# we don't want this attempt to count
@ -240,18 +241,18 @@ class BeaconSendThread(aprsd_threads.APRSDThread):
_loop_cnt: int = 1
def __init__(self):
super().__init__("BeaconSendThread")
super().__init__('BeaconSendThread')
self._loop_cnt = 1
# Make sure Latitude and Longitude are set.
if not CONF.latitude or not CONF.longitude:
LOG.error(
"Latitude and Longitude are not set in the config file."
"Beacon will not be sent and thread is STOPPED.",
'Latitude and Longitude are not set in the config file.'
'Beacon will not be sent and thread is STOPPED.',
)
self.stop()
LOG.info(
"Beacon thread is running and will send "
f"beacons every {CONF.beacon_interval} seconds.",
'Beacon thread is running and will send '
f'beacons every {CONF.beacon_interval} seconds.',
)
def loop(self):
@ -259,10 +260,10 @@ class BeaconSendThread(aprsd_threads.APRSDThread):
if self._loop_cnt % CONF.beacon_interval == 0:
pkt = core.BeaconPacket(
from_call=CONF.callsign,
to_call="APRS",
to_call='APRS',
latitude=float(CONF.latitude),
longitude=float(CONF.longitude),
comment="APRSD GPS Beacon",
comment='APRSD GPS Beacon',
symbol=CONF.beacon_symbol,
)
try:
@ -270,8 +271,8 @@ class BeaconSendThread(aprsd_threads.APRSDThread):
pkt.retry_count = 1
send(pkt, direct=True)
except Exception as e:
LOG.error(f"Failed to send beacon: {e}")
client_factory.create().reset()
LOG.error(f'Failed to send beacon: {e}')
APRSDClient().reset()
time.sleep(5)
self._loop_cnt += 1


@ -3,7 +3,7 @@ from typing import Callable, Protocol, runtime_checkable
from aprsd.utils import singleton
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
@runtime_checkable
@ -33,7 +33,8 @@ class KeepAliveCollector:
try:
cls.keepalive_check()
except Exception as e:
LOG.error(f"Error in producer {name} (check): {e}")
LOG.error(f'Error in producer {name} (check): {e}')
raise e
def log(self) -> None:
"""Log any relevant information during a KeepAlive check"""
@ -42,14 +43,15 @@ class KeepAliveCollector:
try:
cls.keepalive_log()
except Exception as e:
LOG.error(f"Error in producer {name} (check): {e}")
LOG.error(f'Error in producer {name} (check): {e}')
raise e
def register(self, producer_name: Callable):
if not isinstance(producer_name, KeepAliveProducer):
raise TypeError(f"Producer {producer_name} is not a KeepAliveProducer")
raise TypeError(f'Producer {producer_name} is not a KeepAliveProducer')
self.producers.append(producer_name)
def unregister(self, producer_name: Callable):
if not isinstance(producer_name, KeepAliveProducer):
raise TypeError(f"Producer {producer_name} is not a KeepAliveProducer")
raise TypeError(f'Producer {producer_name} is not a KeepAliveProducer')
self.producers.remove(producer_name)


@ -6,9 +6,8 @@ import threading
from oslo_config import cfg
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG = logging.getLogger('APRSD')
class ObjectStoreMixin:
@ -63,7 +62,7 @@ class ObjectStoreMixin:
def _save_filename(self):
save_location = CONF.save_location
return "{}/{}.p".format(
return '{}/{}.p'.format(
save_location,
self.__class__.__name__.lower(),
)
@ -75,13 +74,13 @@ class ObjectStoreMixin:
self._init_store()
save_filename = self._save_filename()
if len(self) > 0:
LOG.info(
f"{self.__class__.__name__}::Saving"
f" {len(self)} entries to disk at "
f"{save_filename}",
LOG.debug(
f'{self.__class__.__name__}::Saving'
f' {len(self)} entries to disk at '
f'{save_filename}',
)
with self.lock:
with open(save_filename, "wb+") as fp:
with open(save_filename, 'wb+') as fp:
pickle.dump(self.data, fp)
else:
LOG.debug(
@ -97,21 +96,21 @@ class ObjectStoreMixin:
return
if os.path.exists(self._save_filename()):
try:
with open(self._save_filename(), "rb") as fp:
with open(self._save_filename(), 'rb') as fp:
raw = pickle.load(fp)
if raw:
self.data = raw
LOG.debug(
f"{self.__class__.__name__}::Loaded {len(self)} entries from disk.",
f'{self.__class__.__name__}::Loaded {len(self)} entries from disk.',
)
else:
LOG.debug(f"{self.__class__.__name__}::No data to load.")
LOG.debug(f'{self.__class__.__name__}::No data to load.')
except (pickle.UnpicklingError, Exception) as ex:
LOG.error(f"Failed to UnPickle {self._save_filename()}")
LOG.error(f'Failed to UnPickle {self._save_filename()}')
LOG.error(ex)
self.data = {}
else:
LOG.debug(f"{self.__class__.__name__}::No save file found.")
LOG.debug(f'{self.__class__.__name__}::No save file found.')
def flush(self):
"""Nuke the old pickle file that stored the old results from last aprsd run."""


@ -1,14 +1,27 @@
version: "3"
services:
aprsd:
image: hemna6969/aprsd:latest
container_name: aprsd
ports:
- "8001:8001"
volumes:
- $HOME/.config/aprsd:/config
restart: unless-stopped
environment:
- TZ=America/New_York
- APRSD_PLUGINS=aprsd-slack-plugin>=1.0.2
- LOG_LEVEL=ERROR
aprsd:
image: hemna6969/aprsd:latest
container_name: aprsd-server
volumes:
- $HOME/.config/aprsd/:/config # left side of the : is your directory where your config is
# outside of your container. Your normal filesystem.
restart: unless-stopped
environment:
- TZ=America/New_York
- APRSD_PLUGINS=aprsd-email-plugin,aprsd-borat-plugin
- LOG_LEVEL=ERROR
aprsd-admin: # Admin interface
image: hemna6969/aprsd:latest
container_name: aprsd-admin
volumes:
- $HOME/.config/aprsd/:/config # left side of the : is your directory where your config is
# outside of your container. Your normal filesystem.
restart: unless-stopped
ports:
- 8001:8001 # left side of the : is your port on your host that you can access
# the web interface for the admin interface.
entrypoint: /app/admin.sh
environment:
- TZ=America/New_York
- APRSD_EXTENSIONS=git+https://github.com/hemna/aprsd-admin-extension.git


@ -1,10 +1,10 @@
# This file was autogenerated by uv via the following command:
# uv pip compile --resolver backtracking --annotation-style=line requirements-dev.in -o requirements-dev.txt
alabaster==1.0.0 # via sphinx
babel==2.16.0 # via sphinx
babel==2.17.0 # via sphinx
build==1.2.2.post1 # via pip-tools, -r requirements-dev.in
cachetools==5.5.1 # via tox
certifi==2024.12.14 # via requests
cachetools==5.5.2 # via tox
certifi==2025.1.31 # via requests
cfgv==3.4.0 # via pre-commit
chardet==5.2.0 # via tox
charset-normalizer==3.4.1 # via requests
@ -12,38 +12,37 @@ click==8.1.8 # via pip-tools
colorama==0.4.6 # via tox
distlib==0.3.9 # via virtualenv
docutils==0.21.2 # via m2r, sphinx
filelock==3.17.0 # via tox, virtualenv
identify==2.6.6 # via pre-commit
filelock==3.18.0 # via tox, virtualenv
identify==2.6.10 # via pre-commit
idna==3.10 # via requests
imagesize==1.4.1 # via sphinx
jinja2==3.1.5 # via sphinx
jinja2==3.1.6 # via sphinx
m2r==0.3.1 # via -r requirements-dev.in
markupsafe==3.0.2 # via jinja2
mistune==0.8.4 # via m2r
nodeenv==1.9.1 # via pre-commit
packaging==24.2 # via build, pyproject-api, sphinx, tox
pip==24.3.1 # via pip-tools, -r requirements-dev.in
packaging==25.0 # via build, pyproject-api, sphinx, tox
pip==25.0.1 # via pip-tools, -r requirements-dev.in
pip-tools==7.4.1 # via -r requirements-dev.in
platformdirs==4.3.6 # via tox, virtualenv
platformdirs==4.3.7 # via tox, virtualenv
pluggy==1.5.0 # via tox
pre-commit==4.1.0 # via -r requirements-dev.in
pre-commit==4.2.0 # via -r requirements-dev.in
pygments==2.19.1 # via sphinx
pyproject-api==1.9.0 # via tox
pyproject-hooks==1.2.0 # via build, pip-tools
pyyaml==6.0.2 # via pre-commit
requests==2.32.3 # via sphinx
setuptools==75.8.0 # via pip-tools
roman-numerals-py==3.1.0 # via sphinx
setuptools==79.0.1 # via pip-tools
snowballstemmer==2.2.0 # via sphinx
sphinx==8.1.3 # via -r requirements-dev.in
sphinx==8.2.3 # via -r requirements-dev.in
sphinxcontrib-applehelp==2.0.0 # via sphinx
sphinxcontrib-devhelp==2.0.0 # via sphinx
sphinxcontrib-htmlhelp==2.1.0 # via sphinx
sphinxcontrib-jsmath==1.0.1 # via sphinx
sphinxcontrib-qthelp==2.0.0 # via sphinx
sphinxcontrib-serializinghtml==2.0.0 # via sphinx
tomli==2.2.1 # via build, pip-tools, pyproject-api, sphinx, tox
tox==4.24.1 # via -r requirements-dev.in
typing-extensions==4.12.2 # via tox
urllib3==2.3.0 # via requests
virtualenv==20.29.1 # via pre-commit, tox
tox==4.25.0 # via -r requirements-dev.in
urllib3==2.4.0 # via requests
virtualenv==20.30.0 # via pre-commit, tox
wheel==0.45.1 # via pip-tools, -r requirements-dev.in


@ -7,8 +7,7 @@ loguru
oslo.config
pluggy
requests
# Pinned due to gray needing 12.6.0
rich~=12.6.0
rich
rush
thesmuggler
tzlocal


@ -1,13 +1,12 @@
# This file was autogenerated by uv via the following command:
# uv pip compile --resolver backtracking --annotation-style=line requirements.in -o requirements.txt
aprslib==0.7.2 # via -r requirements.in
attrs==24.3.0 # via ax253, kiss3, rush
attrs==25.3.0 # via ax253, kiss3, rush
ax253==0.1.5.post1 # via kiss3
bitarray==3.0.0 # via ax253, kiss3
certifi==2024.12.14 # via requests
bitarray==3.3.1 # via ax253, kiss3
certifi==2025.1.31 # via requests
charset-normalizer==3.4.1 # via requests
click==8.1.8 # via -r requirements.in
commonmark==0.9.1 # via rich
dataclasses-json==0.6.7 # via -r requirements.in
debtcollector==3.0.0 # via oslo-config
haversine==2.9.0 # via -r requirements.in
@ -15,30 +14,33 @@ idna==3.10 # via requests
importlib-metadata==8.6.1 # via ax253, kiss3
kiss3==8.0.0 # via -r requirements.in
loguru==0.7.3 # via -r requirements.in
marshmallow==3.26.0 # via dataclasses-json
mypy-extensions==1.0.0 # via typing-inspect
markdown-it-py==3.0.0 # via rich
marshmallow==3.26.1 # via dataclasses-json
mdurl==0.1.2 # via markdown-it-py
mypy-extensions==1.1.0 # via typing-inspect
netaddr==1.3.0 # via oslo-config
oslo-config==9.7.0 # via -r requirements.in
oslo-i18n==6.5.0 # via oslo-config
packaging==24.2 # via marshmallow
pbr==6.1.0 # via oslo-i18n, stevedore
oslo-config==9.7.1 # via -r requirements.in
oslo-i18n==6.5.1 # via oslo-config
packaging==25.0 # via marshmallow
pbr==6.1.1 # via oslo-i18n, stevedore
pluggy==1.5.0 # via -r requirements.in
pygments==2.19.1 # via rich
pyserial==3.5 # via pyserial-asyncio
pyserial-asyncio==0.6 # via kiss3
pytz==2024.2 # via -r requirements.in
pytz==2025.2 # via -r requirements.in
pyyaml==6.0.2 # via oslo-config
requests==2.32.3 # via oslo-config, update-checker, -r requirements.in
rfc3986==2.0.0 # via oslo-config
rich==12.6.0 # via -r requirements.in
rich==14.0.0 # via -r requirements.in
rush==2021.4.0 # via -r requirements.in
stevedore==5.4.0 # via oslo-config
setuptools==79.0.1 # via pbr
stevedore==5.4.1 # via oslo-config
thesmuggler==1.0.1 # via -r requirements.in
timeago==1.0.16 # via -r requirements.in
typing-extensions==4.12.2 # via typing-inspect
typing-extensions==4.13.2 # via typing-inspect
typing-inspect==0.9.0 # via dataclasses-json
tzlocal==5.2 # via -r requirements.in
tzlocal==5.3.1 # via -r requirements.in
update-checker==0.18.0 # via -r requirements.in
urllib3==2.3.0 # via requests
urllib3==2.4.0 # via requests
wrapt==1.17.2 # via debtcollector, -r requirements.in
zipp==3.21.0 # via importlib-metadata

tests/client/__init__.py Normal file



@ -0,0 +1,440 @@
import datetime
import unittest
from unittest import mock
from aprslib.exceptions import LoginError
from aprsd import exception
from aprsd.client.drivers.aprsis import APRSISDriver
from aprsd.client.drivers.registry import ClientDriver
from aprsd.packets import core
class TestAPRSISDriver(unittest.TestCase):
"""Unit tests for the APRSISDriver class."""
def setUp(self):
# Mock configuration
self.conf_patcher = mock.patch('aprsd.client.drivers.aprsis.CONF')
self.mock_conf = self.conf_patcher.start()
# Configure APRS-IS settings
self.mock_conf.aprs_network.enabled = True
self.mock_conf.aprs_network.login = 'TEST'
self.mock_conf.aprs_network.password = '12345'
self.mock_conf.aprs_network.host = 'rotate.aprs.net'
self.mock_conf.aprs_network.port = 14580
# Mock APRS Lib Client
self.aprslib_patcher = mock.patch('aprsd.client.drivers.aprsis.APRSLibClient')
self.mock_aprslib = self.aprslib_patcher.start()
self.mock_client = mock.MagicMock()
self.mock_aprslib.return_value = self.mock_client
# Create an instance of the driver
self.driver = APRSISDriver()
def tearDown(self):
self.conf_patcher.stop()
self.aprslib_patcher.stop()
def test_implements_client_driver_protocol(self):
"""Test that APRSISDriver implements the ClientDriver Protocol."""
# Verify the instance is recognized as implementing the Protocol
self.assertIsInstance(self.driver, ClientDriver)
# Verify all required methods are present with correct signatures
required_methods = [
'is_enabled',
'is_configured',
'is_alive',
'close',
'send',
'setup_connection',
'set_filter',
'login_success',
'login_failure',
'consumer',
'decode_packet',
'stats',
]
for method_name in required_methods:
self.assertTrue(
hasattr(self.driver, method_name),
f'Missing required method: {method_name}',
)
def test_init(self):
"""Test initialization sets default values."""
self.assertIsInstance(self.driver.max_delta, datetime.timedelta)
self.assertEqual(self.driver.max_delta, datetime.timedelta(minutes=2))
self.assertFalse(self.driver.login_status['success'])
self.assertIsNone(self.driver.login_status['message'])
self.assertIsNone(self.driver._client)
def test_is_enabled_true(self):
"""Test is_enabled returns True when APRS-IS is enabled."""
self.mock_conf.aprs_network.enabled = True
self.assertTrue(APRSISDriver.is_enabled())
def test_is_enabled_false(self):
"""Test is_enabled returns False when APRS-IS is disabled."""
self.mock_conf.aprs_network.enabled = False
self.assertFalse(APRSISDriver.is_enabled())
def test_is_enabled_key_error(self):
"""Test is_enabled returns False when enabled flag doesn't exist."""
self.mock_conf.aprs_network = mock.MagicMock()
type(self.mock_conf.aprs_network).enabled = mock.PropertyMock(
side_effect=KeyError
)
self.assertFalse(APRSISDriver.is_enabled())
def test_is_configured_true(self):
"""Test is_configured returns True when properly configured."""
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
self.mock_conf.aprs_network.login = 'TEST'
self.mock_conf.aprs_network.password = '12345'
self.mock_conf.aprs_network.host = 'rotate.aprs.net'
self.assertTrue(APRSISDriver.is_configured())
def test_is_configured_no_login(self):
"""Test is_configured raises exception when login not set."""
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
self.mock_conf.aprs_network.login = None
with self.assertRaises(exception.MissingConfigOptionException):
APRSISDriver.is_configured()
def test_is_configured_no_password(self):
"""Test is_configured raises exception when password not set."""
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
self.mock_conf.aprs_network.login = 'TEST'
self.mock_conf.aprs_network.password = None
with self.assertRaises(exception.MissingConfigOptionException):
APRSISDriver.is_configured()
def test_is_configured_no_host(self):
"""Test is_configured raises exception when host not set."""
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=True):
self.mock_conf.aprs_network.login = 'TEST'
self.mock_conf.aprs_network.password = '12345'
self.mock_conf.aprs_network.host = None
with self.assertRaises(exception.MissingConfigOptionException):
APRSISDriver.is_configured()
def test_is_configured_disabled(self):
"""Test is_configured returns True when not enabled."""
with mock.patch.object(APRSISDriver, 'is_enabled', return_value=False):
self.assertTrue(APRSISDriver.is_configured())
def test_is_alive_no_client(self):
"""Test is_alive returns False when no client."""
self.driver._client = None
self.assertFalse(self.driver.is_alive)
def test_is_alive_true(self):
"""Test is_alive returns True when client is alive and connection is not stale."""
self.driver._client = self.mock_client
self.mock_client.is_alive.return_value = True
with mock.patch.object(self.driver, '_is_stale_connection', return_value=False):
self.assertTrue(self.driver.is_alive)
def test_is_alive_client_not_alive(self):
"""Test is_alive returns False when client is not alive."""
self.driver._client = self.mock_client
self.mock_client.is_alive.return_value = False
with mock.patch.object(self.driver, '_is_stale_connection', return_value=False):
self.assertFalse(self.driver.is_alive)
def test_is_alive_stale_connection(self):
"""Test is_alive returns False when connection is stale."""
self.driver._client = self.mock_client
self.mock_client.is_alive.return_value = True
with mock.patch.object(self.driver, '_is_stale_connection', return_value=True):
self.assertFalse(self.driver.is_alive)
def test_close(self):
"""Test close method stops and closes the client."""
self.driver._client = self.mock_client
self.driver.close()
self.mock_client.stop.assert_called_once()
self.mock_client.close.assert_called_once()
def test_close_no_client(self):
"""Test close method handles no client gracefully."""
self.driver._client = None
# Should not raise exception
self.driver.close()
def test_send(self):
"""Test send passes packet to client."""
self.driver._client = self.mock_client
mock_packet = mock.MagicMock(spec=core.Packet)
self.driver.send(mock_packet)
self.mock_client.send.assert_called_once_with(mock_packet)
@mock.patch('aprsd.client.drivers.aprsis.LOG')
def test_setup_connection_success(self, mock_log):
"""Test setup_connection successfully connects."""
# Configure successful connection
self.mock_client.server_string = 'Test APRS-IS Server'
self.driver.setup_connection()
# Check client created with correct parameters
self.mock_aprslib.assert_called_once_with(
self.mock_conf.aprs_network.login,
passwd=self.mock_conf.aprs_network.password,
host=self.mock_conf.aprs_network.host,
port=self.mock_conf.aprs_network.port,
)
# Check logger set and connection initialized
self.assertEqual(self.mock_client.logger, mock_log)
self.mock_client.connect.assert_called_once()
# Check status updated
self.assertTrue(self.driver.connected)
self.assertTrue(self.driver.login_status['success'])
self.assertEqual(self.driver.login_status['message'], 'Test APRS-IS Server')
@mock.patch('aprsd.client.drivers.aprsis.LOG')
@mock.patch('aprsd.client.drivers.aprsis.time.sleep')
def test_setup_connection_login_error(self, mock_sleep, mock_log):
"""Test setup_connection handles login error."""
# Configure login error
login_error = LoginError('Bad login')
login_error.message = 'Invalid login credentials'
self.mock_client.connect.side_effect = login_error
self.driver.setup_connection()
# Check error logged
mock_log.error.assert_any_call("Failed to login to APRS-IS Server 'Bad login'")
mock_log.error.assert_any_call('Invalid login credentials')
# Check status updated
self.assertFalse(self.driver.connected)
self.assertFalse(self.driver.login_status['success'])
self.assertEqual(
self.driver.login_status['message'], 'Invalid login credentials'
)
# Check backoff used
mock_sleep.assert_called()
@mock.patch('aprsd.client.drivers.aprsis.LOG')
@mock.patch('aprsd.client.drivers.aprsis.time.sleep')
def test_setup_connection_general_error(self, mock_sleep, mock_log):
"""Test setup_connection handles general error."""
# Configure general exception
error_message = 'Connection error'
error = Exception(error_message)
# Standard exceptions don't have a message attribute
self.mock_client.connect.side_effect = error
self.driver.setup_connection()
# Check error logged
mock_log.error.assert_any_call(
f"Unable to connect to APRS-IS server. '{error_message}' "
)
# Check status updated
self.assertFalse(self.driver.connected)
self.assertFalse(self.driver.login_status['success'])
# Check login message contains the error message (more flexible than exact equality)
self.assertIn(error_message, self.driver.login_status['message'])
# Check backoff used
mock_sleep.assert_called()
def test_set_filter(self):
"""Test set_filter passes filter to client."""
self.driver._client = self.mock_client
test_filter = 'm/50'
self.driver.set_filter(test_filter)
self.mock_client.set_filter.assert_called_once_with(test_filter)
def test_login_success(self):
"""Test login_success returns login status."""
self.driver.login_status['success'] = True
self.assertTrue(self.driver.login_success())
self.driver.login_status['success'] = False
self.assertFalse(self.driver.login_success())
def test_login_failure(self):
"""Test login_failure returns error message."""
self.driver.login_status['message'] = None
self.assertIsNone(self.driver.login_failure())
self.driver.login_status['message'] = 'Test error'
self.assertEqual(self.driver.login_failure(), 'Test error')
def test_filter_property(self):
"""Test filter property returns client filter."""
self.driver._client = self.mock_client
test_filter = 'm/50'
self.mock_client.filter = test_filter
self.assertEqual(self.driver.filter, test_filter)
def test_server_string_property(self):
"""Test server_string property returns client server string."""
self.driver._client = self.mock_client
test_string = 'Test APRS-IS Server'
self.mock_client.server_string = test_string
self.assertEqual(self.driver.server_string, test_string)
def test_keepalive_property(self):
"""Test keepalive property returns client keepalive."""
self.driver._client = self.mock_client
test_time = datetime.datetime.now()
self.mock_client.aprsd_keepalive = test_time
self.assertEqual(self.driver.keepalive, test_time)
@mock.patch('aprsd.client.drivers.aprsis.LOG')
def test_is_stale_connection_true(self, mock_log):
"""Test _is_stale_connection returns True when connection is stale."""
self.driver._client = self.mock_client
# Set keepalive to 3 minutes ago (exceeds max_delta of 2 minutes)
self.mock_client.aprsd_keepalive = datetime.datetime.now() - datetime.timedelta(
minutes=3
)
result = self.driver._is_stale_connection()
self.assertTrue(result)
mock_log.error.assert_called_once()
def test_is_stale_connection_false(self):
"""Test _is_stale_connection returns False when connection is not stale."""
self.driver._client = self.mock_client
# Set keepalive to 1 minute ago (within max_delta of 2 minutes)
self.mock_client.aprsd_keepalive = datetime.datetime.now() - datetime.timedelta(
minutes=1
)
result = self.driver._is_stale_connection()
self.assertFalse(result)
def test_transport(self):
"""Test transport returns appropriate transport type."""
self.assertEqual(APRSISDriver.transport(), 'aprsis')
def test_decode_packet(self):
"""Test decode_packet uses core.factory."""
with mock.patch('aprsd.client.drivers.aprsis.core.factory') as mock_factory:
raw_packet = {'from': 'TEST', 'to': 'APRS'}
self.driver.decode_packet(raw_packet)
mock_factory.assert_called_once_with(raw_packet)
@mock.patch('aprsd.client.drivers.aprsis.LOG')
def test_consumer_success(self, mock_log):
"""Test consumer forwards callback to client."""
self.driver._client = self.mock_client
mock_callback = mock.MagicMock()
self.driver.consumer(mock_callback, raw=True)
self.mock_client.consumer.assert_called_once_with(
mock_callback, blocking=False, immortal=False, raw=True
)
@mock.patch('aprsd.client.drivers.aprsis.LOG')
def test_consumer_exception(self, mock_log):
"""Test consumer handles exceptions."""
self.driver._client = self.mock_client
mock_callback = mock.MagicMock()
test_error = Exception('Test error')
self.mock_client.consumer.side_effect = test_error
with self.assertRaises(Exception): # noqa: B017
self.driver.consumer(mock_callback)
mock_log.error.assert_called_with(test_error)
@mock.patch('aprsd.client.drivers.aprsis.LOG')
def test_consumer_no_client(self, mock_log):
"""Test consumer handles no client gracefully."""
self.driver._client = None
mock_callback = mock.MagicMock()
self.driver.consumer(mock_callback)
mock_log.warning.assert_called_once()
self.assertFalse(self.driver.connected)
def test_stats_configured_with_client(self):
"""Test stats returns correct data when configured with client."""
# Configure driver
with mock.patch.object(self.driver, 'is_configured', return_value=True):
self.driver._client = self.mock_client
self.mock_client.aprsd_keepalive = datetime.datetime.now()
self.mock_client.server_string = 'Test Server'
self.mock_client.filter = 'm/50'
stats = self.driver.stats()
self.assertEqual(stats['connected'], True)
self.assertEqual(stats['filter'], 'm/50')
self.assertEqual(stats['server_string'], 'Test Server')
self.assertEqual(stats['transport'], 'aprsis')
def test_stats_serializable(self):
"""Test stats with serializable=True converts datetime to ISO format."""
# Configure driver
with mock.patch.object(self.driver, 'is_configured', return_value=True):
self.driver._client = self.mock_client
test_time = datetime.datetime.now()
self.mock_client.aprsd_keepalive = test_time
stats = self.driver.stats(serializable=True)
# Check keepalive is a string in ISO format
self.assertIsInstance(stats['connection_keepalive'], str)
# Try parsing it to verify it's a valid ISO format
try:
datetime.datetime.fromisoformat(stats['connection_keepalive'])
except ValueError:
self.fail('keepalive is not in valid ISO format')
def test_stats_no_client(self):
"""Test stats with no client."""
with mock.patch.object(self.driver, 'is_configured', return_value=True):
self.driver._client = None
stats = self.driver.stats()
self.assertEqual(stats['connection_keepalive'], 'None')
self.assertEqual(stats['server_string'], 'None')
def test_stats_not_configured(self):
"""Test stats when not configured returns empty dict."""
with mock.patch.object(self.driver, 'is_configured', return_value=False):
stats = self.driver.stats()
self.assertEqual(stats, {})
if __name__ == '__main__':
unittest.main()
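For reference, a bare-bones skeleton of the shape the ClientDriver protocol check above looks for; only the method names are taken from the test's required-methods list, and every signature and return value is an assumption, so a real driver would have to do actual work in each of these:

from aprsd.client.drivers.registry import ClientDriver

class NullDriver:
    # Present purely so a runtime-checkable isinstance() check against
    # ClientDriver can find these attributes; protocols only verify presence,
    # not signatures, assuming the protocol declares just the methods the
    # test enumerates.
    @staticmethod
    def is_enabled():
        return False

    @staticmethod
    def is_configured():
        return False

    def is_alive(self):
        return False

    def close(self):
        pass

    def send(self, packet):
        pass

    def setup_connection(self):
        pass

    def set_filter(self, filter_str):
        pass

    def login_success(self):
        return True

    def login_failure(self):
        return None

    def consumer(self, callback, raw=False):
        pass

    def decode_packet(self, *args, **kwargs):
        return None

    def stats(self, serializable=False):
        return {}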


@ -0,0 +1,191 @@
import unittest
from unittest import mock
from aprsd.client.drivers.fake import APRSDFakeDriver
from aprsd.packets import core
class TestAPRSDFakeDriver(unittest.TestCase):
"""Unit tests for the APRSDFakeDriver class."""
def setUp(self):
# Mock CONF for testing
self.conf_patcher = mock.patch('aprsd.client.drivers.fake.CONF')
self.mock_conf = self.conf_patcher.start()
# Configure fake_client.enabled
self.mock_conf.fake_client.enabled = True
# Create an instance of the driver
self.driver = APRSDFakeDriver()
def tearDown(self):
self.conf_patcher.stop()
def test_init(self):
"""Test initialization sets default values."""
self.assertEqual(self.driver.path, ['WIDE1-1', 'WIDE2-1'])
self.assertFalse(self.driver.thread_stop)
def test_is_enabled_true(self):
"""Test is_enabled returns True when configured."""
self.mock_conf.fake_client.enabled = True
self.assertTrue(APRSDFakeDriver.is_enabled())
def test_is_enabled_false(self):
"""Test is_enabled returns False when not configured."""
self.mock_conf.fake_client.enabled = False
self.assertFalse(APRSDFakeDriver.is_enabled())
def test_is_alive(self):
"""Test is_alive returns True when thread_stop is False."""
self.driver.thread_stop = False
self.assertTrue(self.driver.is_alive())
self.driver.thread_stop = True
self.assertFalse(self.driver.is_alive())
def test_close(self):
"""Test close sets thread_stop to True."""
self.driver.thread_stop = False
self.driver.close()
self.assertTrue(self.driver.thread_stop)
@mock.patch('aprsd.client.drivers.fake.LOG')
def test_setup_connection(self, mock_log):
"""Test setup_connection does nothing (it's fake)."""
self.driver.setup_connection()
# Method doesn't do anything, so just verify it doesn't crash
def test_set_filter(self):
"""Test set_filter method does nothing (it's fake)."""
# Just test it doesn't fail
self.driver.set_filter('test/filter')
def test_login_success(self):
"""Test login_success always returns True."""
self.assertTrue(self.driver.login_success())
def test_login_failure(self):
"""Test login_failure always returns None."""
self.assertIsNone(self.driver.login_failure())
@mock.patch('aprsd.client.drivers.fake.LOG')
def test_send_with_packet_object(self, mock_log):
"""Test send with a Packet object."""
mock_packet = mock.MagicMock(spec=core.Packet)
mock_packet.payload = 'Test payload'
mock_packet.to_call = 'TEST'
mock_packet.from_call = 'FAKE'
self.driver.send(mock_packet)
mock_log.info.assert_called_once()
mock_packet.prepare.assert_called_once()
@mock.patch('aprsd.client.drivers.fake.LOG')
def test_send_with_non_packet_object(self, mock_log):
"""Test send with a non-Packet object."""
# Create a mock message-like object
mock_msg = mock.MagicMock()
mock_msg.raw = 'Test'
mock_msg.msgNo = '123'
mock_msg.to_call = 'TEST'
mock_msg.from_call = 'FAKE'
self.driver.send(mock_msg)
mock_log.info.assert_called_once()
mock_log.debug.assert_called_once()
@mock.patch('aprsd.client.drivers.fake.LOG')
@mock.patch('aprsd.client.drivers.fake.time.sleep')
def test_consumer_with_raw_true(self, mock_sleep, mock_log):
"""Test consumer with raw=True."""
mock_callback = mock.MagicMock()
self.driver.consumer(mock_callback, raw=True)
# Verify callback was called with raw data
mock_callback.assert_called_once()
call_args = mock_callback.call_args[1]
self.assertIn('raw', call_args)
mock_sleep.assert_called_once_with(1)
@mock.patch('aprsd.client.drivers.fake.LOG')
@mock.patch('aprsd.client.drivers.fake.aprslib.parse')
@mock.patch('aprsd.client.drivers.fake.core.factory')
@mock.patch('aprsd.client.drivers.fake.time.sleep')
def test_consumer_with_raw_false(
self, mock_sleep, mock_factory, mock_parse, mock_log
):
"""Test consumer with raw=False."""
mock_callback = mock.MagicMock()
mock_packet = mock.MagicMock(spec=core.Packet)
mock_factory.return_value = mock_packet
self.driver.consumer(mock_callback, raw=False)
# Verify the packet was created and passed to callback
mock_parse.assert_called_once()
mock_factory.assert_called_once()
mock_callback.assert_called_once_with(packet=mock_packet)
mock_sleep.assert_called_once_with(1)
def test_consumer_updates_keepalive(self):
"""Test consumer updates keepalive timestamp."""
mock_callback = mock.MagicMock()
old_keepalive = self.driver.aprsd_keepalive
# Force a small delay to ensure timestamp changes
import time
time.sleep(0.01)
with mock.patch('aprsd.client.drivers.fake.time.sleep'):
self.driver.consumer(mock_callback)
self.assertNotEqual(old_keepalive, self.driver.aprsd_keepalive)
self.assertGreater(self.driver.aprsd_keepalive, old_keepalive)
def test_decode_packet_with_empty_kwargs(self):
"""Test decode_packet with empty kwargs."""
result = self.driver.decode_packet()
self.assertIsNone(result)
def test_decode_packet_with_packet(self):
"""Test decode_packet with packet in kwargs."""
mock_packet = mock.MagicMock(spec=core.Packet)
result = self.driver.decode_packet(packet=mock_packet)
self.assertEqual(result, mock_packet)
@mock.patch('aprsd.client.drivers.fake.aprslib.parse')
@mock.patch('aprsd.client.drivers.fake.core.factory')
def test_decode_packet_with_raw(self, mock_factory, mock_parse):
"""Test decode_packet with raw in kwargs."""
mock_packet = mock.MagicMock(spec=core.Packet)
mock_factory.return_value = mock_packet
raw_data = 'raw packet data'
result = self.driver.decode_packet(raw=raw_data)
mock_parse.assert_called_once_with(raw_data)
mock_factory.assert_called_once_with(mock_parse.return_value)
self.assertEqual(result, mock_packet)
def test_stats(self):
"""Test stats returns correct information."""
self.driver.thread_stop = False
result = self.driver.stats()
self.assertEqual(result['driver'], 'APRSDFakeDriver')
self.assertTrue(result['is_alive'])
# Test with serializable parameter
result_serializable = self.driver.stats(serializable=True)
self.assertEqual(result_serializable['driver'], 'APRSDFakeDriver')
self.assertTrue(result_serializable['is_alive'])
if __name__ == '__main__':
unittest.main()


@ -0,0 +1,498 @@
import datetime
import socket
import unittest
from unittest import mock
import aprslib
from aprsd import exception
from aprsd.client.drivers.registry import ClientDriver
from aprsd.client.drivers.tcpkiss import TCPKISSDriver
from aprsd.packets import core
class TestTCPKISSDriver(unittest.TestCase):
"""Unit tests for the TCPKISSDriver class."""
def setUp(self):
# Mock configuration
self.conf_patcher = mock.patch('aprsd.client.drivers.tcpkiss.CONF')
self.mock_conf = self.conf_patcher.start()
# Configure KISS settings
self.mock_conf.kiss_tcp.enabled = True
self.mock_conf.kiss_tcp.host = '127.0.0.1'
self.mock_conf.kiss_tcp.port = 8001
self.mock_conf.kiss_tcp.path = ['WIDE1-1', 'WIDE2-1']
# Mock socket
self.socket_patcher = mock.patch('aprsd.client.drivers.tcpkiss.socket')
self.mock_socket_module = self.socket_patcher.start()
self.mock_socket = mock.MagicMock()
self.mock_socket_module.socket.return_value = self.mock_socket
# Mock select
self.select_patcher = mock.patch('aprsd.client.drivers.tcpkiss.select')
self.mock_select = self.select_patcher.start()
# Create an instance of the driver
self.driver = TCPKISSDriver()
def tearDown(self):
self.conf_patcher.stop()
self.socket_patcher.stop()
self.select_patcher.stop()
def test_implements_client_driver_protocol(self):
"""Test that TCPKISSDriver implements the ClientDriver Protocol."""
# Verify the instance is recognized as implementing the Protocol
self.assertIsInstance(self.driver, ClientDriver)
# Verify all required methods are present with correct signatures
required_methods = [
'is_enabled',
'is_configured',
'is_alive',
'close',
'send',
'setup_connection',
'set_filter',
'login_success',
'login_failure',
'consumer',
'decode_packet',
'stats',
]
for method_name in required_methods:
self.assertTrue(
hasattr(self.driver, method_name),
f'Missing required method: {method_name}',
)
def test_init(self):
"""Test initialization sets default values."""
self.assertFalse(self.driver._connected)
self.assertIsInstance(self.driver.keepalive, datetime.datetime)
self.assertFalse(self.driver._running)
def test_transport_property(self):
"""Test transport property returns correct value."""
self.assertEqual(self.driver.transport, 'tcpkiss')
def test_is_enabled_true(self):
"""Test is_enabled returns True when KISS TCP is enabled."""
self.mock_conf.kiss_tcp.enabled = True
self.assertTrue(TCPKISSDriver.is_enabled())
def test_is_enabled_false(self):
"""Test is_enabled returns False when KISS TCP is disabled."""
self.mock_conf.kiss_tcp.enabled = False
self.assertFalse(TCPKISSDriver.is_enabled())
def test_is_configured_true(self):
"""Test is_configured returns True when properly configured."""
with mock.patch.object(TCPKISSDriver, 'is_enabled', return_value=True):
self.mock_conf.kiss_tcp.host = '127.0.0.1'
self.assertTrue(TCPKISSDriver.is_configured())
def test_is_configured_false_no_host(self):
"""Test is_configured returns False when host not set."""
with mock.patch.object(TCPKISSDriver, 'is_enabled', return_value=True):
self.mock_conf.kiss_tcp.host = None
with self.assertRaises(exception.MissingConfigOptionException):
TCPKISSDriver.is_configured()
def test_is_configured_false_not_enabled(self):
"""Test is_configured returns False when not enabled."""
with mock.patch.object(TCPKISSDriver, 'is_enabled', return_value=False):
self.assertFalse(TCPKISSDriver.is_configured())
def test_is_alive(self):
"""Test is_alive property returns connection state."""
self.driver._connected = True
self.assertTrue(self.driver.is_alive)
self.driver._connected = False
self.assertFalse(self.driver.is_alive)
def test_close(self):
"""Test close method calls stop."""
with mock.patch.object(self.driver, 'stop') as mock_stop:
self.driver.close()
mock_stop.assert_called_once()
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_setup_connection_success(self, mock_log):
"""Test setup_connection successfully connects."""
# Mock the connect method to succeed
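# Save the real is_enabled/is_configured so they can be restored after the assertions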
is_en = self.driver.is_enabled
is_con = self.driver.is_configured
self.driver.is_enabled = mock.MagicMock(return_value=True)
self.driver.is_configured = mock.MagicMock(return_value=True)
with mock.patch.object(
self.driver, 'connect', return_value=True
) as mock_connect:
self.driver.setup_connection()
mock_connect.assert_called_once()
mock_log.info.assert_called_with('KISS TCP Connection to 127.0.0.1:8001')
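# Restore the saved methods so later tests see the real implementations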
self.driver.is_enabled = is_en
self.driver.is_configured = is_con
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_setup_connection_failure(self, mock_log):
"""Test setup_connection handles connection failure."""
# Mock the connect method to fail
with mock.patch.object(
self.driver, 'connect', return_value=False
) as mock_connect:
self.driver.setup_connection()
mock_connect.assert_called_once()
mock_log.error.assert_called_with('Failed to connect to KISS interface')
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_setup_connection_exception(self, mock_log):
"""Test setup_connection handles exceptions."""
# Mock the connect method to raise an exception
with mock.patch.object(
self.driver, 'connect', side_effect=Exception('Test error')
) as mock_connect:
self.driver.setup_connection()
mock_connect.assert_called_once()
mock_log.error.assert_any_call('Failed to initialize KISS interface')
mock_log.exception.assert_called_once()
self.assertFalse(self.driver._connected)
def test_set_filter(self):
"""Test set_filter does nothing for KISS."""
# Just ensure it doesn't fail
self.driver.set_filter('test/filter')
def test_login_success_when_connected(self):
"""Test login_success returns True when connected."""
self.driver._connected = True
self.assertTrue(self.driver.login_success())
def test_login_success_when_not_connected(self):
"""Test login_success returns False when not connected."""
self.driver._connected = False
self.assertFalse(self.driver.login_success())
def test_login_failure(self):
"""Test login_failure returns success message."""
self.assertEqual(self.driver.login_failure(), 'Login successful')
@mock.patch('aprsd.client.drivers.tcpkiss.ax25frame.Frame.ui')
def test_send_packet(self, mock_frame_ui):
"""Test sending an APRS packet."""
# Create a mock frame
mock_frame = mock.MagicMock()
mock_frame_bytes = b'mock_frame_data'
mock_frame.__bytes__ = mock.MagicMock(return_value=mock_frame_bytes)
mock_frame_ui.return_value = mock_frame
# Set up the driver
self.driver.socket = self.mock_socket
self.driver.path = ['WIDE1-1', 'WIDE2-1']
# Create a mock packet
mock_packet = mock.MagicMock(spec=core.Packet)
mock_bytes = b'Test packet data'
mock_packet.__bytes__ = mock.MagicMock(return_value=mock_bytes)
# Add path attribute to the mock packet
mock_packet.path = None
# Send the packet
self.driver.send(mock_packet)
# Check that frame was created correctly
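# APZ100 is the destination (tocall) the driver is expected to set on outgoing frames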
mock_frame_ui.assert_called_once_with(
destination='APZ100',
source=mock_packet.from_call,
path=self.driver.path,
info=mock_packet.payload.encode('US-ASCII'),
)
# Check that socket send was called
self.mock_socket.send.assert_called_once()
# Verify packet counters updated
self.assertEqual(self.driver.packets_sent, 1)
self.assertIsNotNone(self.driver.last_packet_sent)
def test_send_with_no_socket(self):
"""Test send raises exception when socket not initialized."""
self.driver.socket = None
mock_packet = mock.MagicMock(spec=core.Packet)
with self.assertRaises(Exception) as context:
self.driver.send(mock_packet)
self.assertIn('KISS interface not initialized', str(context.exception))
def test_stop(self):
"""Test stop method cleans up properly."""
self.driver._running = True
self.driver._connected = True
self.driver.socket = self.mock_socket
self.driver.stop()
self.assertFalse(self.driver._running)
self.assertFalse(self.driver._connected)
self.mock_socket.close.assert_called_once()
def test_stats(self):
"""Test stats method returns correct data."""
# Set up test data
self.driver._connected = True
self.driver.path = ['WIDE1-1', 'WIDE2-1']
self.driver.packets_sent = 5
self.driver.packets_received = 3
self.driver.last_packet_sent = datetime.datetime.now()
self.driver.last_packet_received = datetime.datetime.now()
# Get stats
stats = self.driver.stats()
# Check stats contains expected keys
expected_keys = [
'client',
'transport',
'connected',
'path',
'packets_sent',
'packets_received',
'last_packet_sent',
'last_packet_received',
'connection_keepalive',
'host',
'port',
]
for key in expected_keys:
self.assertIn(key, stats)
# Check some specific values
self.assertEqual(stats['client'], 'TCPKISSDriver')
self.assertEqual(stats['transport'], 'tcpkiss')
self.assertEqual(stats['connected'], True)
self.assertEqual(stats['packets_sent'], 5)
self.assertEqual(stats['packets_received'], 3)
def test_stats_serializable(self):
"""Test stats with serializable=True converts datetime to ISO format."""
self.driver.keepalive = datetime.datetime.now()
stats = self.driver.stats(serializable=True)
# Check keepalive is a string in ISO format
self.assertIsInstance(stats['connection_keepalive'], str)
# Try parsing it to verify it's a valid ISO format
try:
datetime.datetime.fromisoformat(stats['connection_keepalive'])
except ValueError:
self.fail('keepalive is not in valid ISO format')
def test_connect_success(self):
"""Test successful connection."""
result = self.driver.connect()
self.assertTrue(result)
self.assertTrue(self.driver._connected)
self.mock_socket.connect.assert_called_once_with(
(self.mock_conf.kiss_tcp.host, self.mock_conf.kiss_tcp.port)
)
self.mock_socket.settimeout.assert_any_call(5.0)
self.mock_socket.settimeout.assert_any_call(0.1)
def test_connect_failure_socket_error(self):
"""Test connection failure due to socket error."""
self.mock_socket.connect.side_effect = socket.error('Test socket error')
result = self.driver.connect()
self.assertFalse(result)
self.assertFalse(self.driver._connected)
def test_connect_failure_timeout(self):
"""Test connection failure due to timeout."""
self.mock_socket.connect.side_effect = socket.timeout('Test timeout')
result = self.driver.connect()
self.assertFalse(result)
self.assertFalse(self.driver._connected)
def test_fix_raw_frame(self):
"""Test fix_raw_frame removes KISS markers and handles FEND."""
# Create a test frame with KISS markers
with mock.patch(
'aprsd.client.drivers.tcpkiss.handle_fend', return_value=b'fixed_frame'
) as mock_handle_fend:
raw_frame = b'\xc0\x00some_frame_data\xc0' # \xc0 is FEND
result = self.driver.fix_raw_frame(raw_frame)
mock_handle_fend.assert_called_once_with(b'some_frame_data')
self.assertEqual(result, b'fixed_frame')
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_decode_packet_success(self, mock_log):
"""Test successful packet decoding."""
mock_frame = 'test frame data'
mock_aprs_data = {'from': 'TEST-1', 'to': 'APRS'}
mock_packet = mock.MagicMock(spec=core.Packet)
with mock.patch(
'aprsd.client.drivers.tcpkiss.aprslib.parse', return_value=mock_aprs_data
) as mock_parse:
with mock.patch(
'aprsd.client.drivers.tcpkiss.core.factory', return_value=mock_packet
) as mock_factory:
result = self.driver.decode_packet(frame=mock_frame)
mock_parse.assert_called_once_with(str(mock_frame))
mock_factory.assert_called_once_with(mock_aprs_data)
self.assertEqual(result, mock_packet)
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_decode_packet_no_frame(self, mock_log):
"""Test decode_packet with no frame returns None."""
result = self.driver.decode_packet()
self.assertIsNone(result)
mock_log.warning.assert_called_once()
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_decode_packet_exception(self, mock_log):
"""Test decode_packet handles exceptions."""
mock_frame = 'invalid frame'
with mock.patch(
'aprsd.client.drivers.tcpkiss.aprslib.parse',
side_effect=Exception('Test error'),
) as mock_parse:
result = self.driver.decode_packet(frame=mock_frame)
mock_parse.assert_called_once()
self.assertIsNone(result)
mock_log.error.assert_called_once()
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_consumer_with_frame(self, mock_log):
"""Test consumer processes frames and calls callback."""
mock_callback = mock.MagicMock()
mock_frame = mock.MagicMock()
# Configure driver for test
self.driver._connected = True
self.driver._running = True
# Set up read_frame to return one frame then stop
def side_effect():
self.driver._running = False
return mock_frame
with mock.patch.object(
self.driver, 'read_frame', side_effect=side_effect
) as mock_read_frame:
self.driver.consumer(mock_callback)
mock_read_frame.assert_called_once()
mock_callback.assert_called_once_with(frame=mock_frame)
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_consumer_with_connect_reconnect(self, mock_log):
"""Test consumer tries to reconnect when not connected."""
mock_callback = mock.MagicMock()
# Configure driver for test
self.driver._connected = False
# Setup to run once then stop
call_count = 0
def connect_side_effect():
nonlocal call_count
call_count += 1
# On second call, connect successfully
if call_count == 2:
self.driver._running = False
self.driver.socket = self.mock_socket
return True
return False
with mock.patch.object(
self.driver, 'connect', side_effect=connect_side_effect
) as mock_connect:
with mock.patch('aprsd.client.drivers.tcpkiss.time.sleep') as mock_sleep:
self.driver.consumer(mock_callback)
self.assertEqual(mock_connect.call_count, 2)
mock_sleep.assert_called_once_with(1)
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_read_frame_success(self, mock_log):
"""Test read_frame successfully reads a frame."""
# Set up driver
self.driver.socket = self.mock_socket
self.driver._running = True
# Mock socket recv to return data
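# KISS framing: FEND (0xC0), command byte 0x00 (data frame, port 0), payload, FEND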
raw_data = b'\xc0\x00test_frame\xc0'
self.mock_socket.recv.return_value = raw_data
# Mock select to indicate socket is readable
self.mock_select.select.return_value = ([self.mock_socket], [], [])
# Mock fix_raw_frame and Frame.from_bytes
mock_fixed_frame = b'fixed_frame'
mock_ax25_frame = mock.MagicMock()
with mock.patch.object(
self.driver, 'fix_raw_frame', return_value=mock_fixed_frame
) as mock_fix:
with mock.patch(
'aprsd.client.drivers.tcpkiss.ax25frame.Frame.from_bytes',
return_value=mock_ax25_frame,
) as mock_from_bytes:
result = self.driver.read_frame()
self.mock_socket.setblocking.assert_called_once_with(0)
self.mock_select.select.assert_called_once()
self.mock_socket.recv.assert_called_once()
mock_fix.assert_called_once_with(raw_data)
mock_from_bytes.assert_called_once_with(mock_fixed_frame)
self.assertEqual(result, mock_ax25_frame)
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_read_frame_select_timeout(self, mock_log):
"""Test read_frame handles select timeout."""
# Set up driver
self.driver.socket = self.mock_socket
self.driver._running = True
# Mock select to indicate no readable sockets
self.mock_select.select.return_value = ([], [], [])
result = self.driver.read_frame()
self.assertIsNone(result)
@mock.patch('aprsd.client.drivers.tcpkiss.LOG')
def test_read_frame_socket_error(self, mock_log):
"""Test read_frame handles socket error."""
# Set up driver
self.driver.socket = self.mock_socket
self.driver._running = True
# Mock setblocking to raise OSError
self.mock_socket.setblocking.side_effect = OSError('Test error')
with self.assertRaises(aprslib.ConnectionDrop):
self.driver.read_frame()
mock_log.error.assert_called_once()
if __name__ == '__main__':
unittest.main()

@ -1,89 +0,0 @@
import datetime
import unittest
from unittest import mock
from aprsd import exception
from aprsd.client.aprsis import APRSISClient
class TestAPRSISClient(unittest.TestCase):
"""Test cases for APRSISClient."""
def setUp(self):
"""Set up test fixtures."""
super().setUp()
# Mock the config
self.mock_conf = mock.MagicMock()
self.mock_conf.aprs_network.enabled = True
self.mock_conf.aprs_network.login = "TEST"
self.mock_conf.aprs_network.password = "12345"
self.mock_conf.aprs_network.host = "localhost"
self.mock_conf.aprs_network.port = 14580
@mock.patch("aprsd.client.base.APRSClient")
@mock.patch("aprsd.client.drivers.aprsis.Aprsdis")
def test_stats_not_configured(self, mock_aprsdis, mock_base):
"""Test stats when client is not configured."""
mock_client = mock.MagicMock()
mock_aprsdis.return_value = mock_client
with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
self.client = APRSISClient()
with mock.patch.object(APRSISClient, "is_configured", return_value=False):
stats = self.client.stats()
self.assertEqual({}, stats)
@mock.patch("aprsd.client.base.APRSClient")
@mock.patch("aprsd.client.drivers.aprsis.Aprsdis")
def test_stats_configured(self, mock_aprsdis, mock_base):
"""Test stats when client is configured."""
mock_client = mock.MagicMock()
mock_aprsdis.return_value = mock_client
with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
self.client = APRSISClient()
mock_client = mock.MagicMock()
mock_client.server_string = "test.server:14580"
mock_client.aprsd_keepalive = datetime.datetime.now()
self.client._client = mock_client
self.client.filter = "m/50"
with mock.patch.object(APRSISClient, "is_configured", return_value=True):
stats = self.client.stats()
from rich.console import Console
c = Console()
c.print(stats)
self.assertEqual(
{
"connected": True,
"filter": "m/50",
"login_status": {"message": mock.ANY, "success": True},
"connection_keepalive": mock_client.aprsd_keepalive,
"server_string": mock_client.server_string,
"transport": "aprsis",
},
stats,
)
def test_is_configured_missing_login(self):
"""Test is_configured with missing login."""
self.mock_conf.aprs_network.login = None
with self.assertRaises(exception.MissingConfigOptionException):
APRSISClient.is_configured()
def test_is_configured_missing_password(self):
"""Test is_configured with missing password."""
self.mock_conf.aprs_network.password = None
with self.assertRaises(exception.MissingConfigOptionException):
APRSISClient.is_configured()
def test_is_configured_missing_host(self):
"""Test is_configured with missing host."""
self.mock_conf.aprs_network.host = None
with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
with self.assertRaises(exception.MissingConfigOptionException):
APRSISClient.is_configured()

@ -1,141 +0,0 @@
import unittest
from unittest import mock
from aprsd.client.base import APRSClient
from aprsd.packets import core
class MockAPRSClient(APRSClient):
"""Concrete implementation of APRSClient for testing."""
def stats(self):
return {"packets_received": 0, "packets_sent": 0}
def setup_connection(self):
mock_connection = mock.MagicMock()
# Configure the mock with required methods
mock_connection.close = mock.MagicMock()
mock_connection.stop = mock.MagicMock()
mock_connection.set_filter = mock.MagicMock()
mock_connection.send = mock.MagicMock()
self._client = mock_connection
return mock_connection
def decode_packet(self, *args, **kwargs):
return mock.MagicMock()
def consumer(self, callback, blocking=False, immortal=False, raw=False):
pass
def is_alive(self):
return True
def close(self):
pass
@staticmethod
def is_enabled():
return True
@staticmethod
def transport():
return "mock"
def reset(self):
"""Mock implementation of reset."""
if self._client:
self._client.close()
self._client = self.setup_connection()
if self.filter:
self._client.set_filter(self.filter)
class TestAPRSClient(unittest.TestCase):
def setUp(self):
# Reset the singleton instance before each test
APRSClient._instance = None
APRSClient._client = None
self.client = MockAPRSClient()
def test_singleton_pattern(self):
"""Test that multiple instantiations return the same instance."""
client1 = MockAPRSClient()
client2 = MockAPRSClient()
self.assertIs(client1, client2)
def test_set_filter(self):
"""Test setting APRS filter."""
# Get the existing mock client that was created in __init__
mock_client = self.client._client
test_filter = "m/50"
self.client.set_filter(test_filter)
self.assertEqual(self.client.filter, test_filter)
# The filter is set once during set_filter() and once during reset()
mock_client.set_filter.assert_called_with(test_filter)
@mock.patch("aprsd.client.base.LOG")
def test_reset(self, mock_log):
"""Test client reset functionality."""
# Create a new mock client with the necessary methods
old_client = mock.MagicMock()
self.client._client = old_client
self.client.reset()
# Verify the old client was closed
old_client.close.assert_called_once()
# Verify a new client was created
self.assertIsNotNone(self.client._client)
self.assertNotEqual(old_client, self.client._client)
def test_send_packet(self):
"""Test sending an APRS packet."""
mock_packet = mock.Mock(spec=core.Packet)
self.client.send(mock_packet)
self.client._client.send.assert_called_once_with(mock_packet)
def test_stop(self):
"""Test stopping the client."""
# Ensure client is created first
self.client._create_client()
self.client.stop()
self.client._client.stop.assert_called_once()
@mock.patch("aprsd.client.base.LOG")
def test_create_client_failure(self, mock_log):
"""Test handling of client creation failure."""
# Make setup_connection raise an exception
with mock.patch.object(
self.client,
"setup_connection",
side_effect=Exception("Connection failed"),
):
with self.assertRaises(Exception):
self.client._create_client()
self.assertIsNone(self.client._client)
mock_log.error.assert_called_once()
def test_client_property(self):
"""Test the client property creates client if none exists."""
self.client._client = None
client = self.client.client
self.assertIsNotNone(client)
def test_filter_applied_on_creation(self):
"""Test that filter is applied when creating new client."""
test_filter = "m/50"
self.client.set_filter(test_filter)
# Force client recreation
self.client.reset()
# Verify filter was applied to new client
self.client._client.set_filter.assert_called_with(test_filter)
if __name__ == "__main__":
unittest.main()

@ -1,75 +0,0 @@
import unittest
from unittest import mock
from aprsd.client.factory import Client, ClientFactory
class MockClient:
"""Mock client for testing."""
@classmethod
def is_enabled(cls):
return True
@classmethod
def is_configured(cls):
return True
class TestClientFactory(unittest.TestCase):
"""Test cases for ClientFactory."""
def setUp(self):
"""Set up test fixtures."""
self.factory = ClientFactory()
# Clear any registered clients from previous tests
self.factory.clients = []
def test_singleton(self):
"""Test that ClientFactory is a singleton."""
factory2 = ClientFactory()
self.assertEqual(self.factory, factory2)
def test_register_client(self):
"""Test registering a client."""
self.factory.register(MockClient)
self.assertIn(MockClient, self.factory.clients)
def test_register_invalid_client(self):
"""Test registering an invalid client raises error."""
invalid_client = mock.MagicMock(spec=Client)
with self.assertRaises(ValueError):
self.factory.register(invalid_client)
def test_create_client(self):
"""Test creating a client."""
self.factory.register(MockClient)
client = self.factory.create()
self.assertIsInstance(client, MockClient)
def test_create_no_clients(self):
"""Test creating a client with no registered clients."""
with self.assertRaises(Exception):
self.factory.create()
def test_is_client_enabled(self):
"""Test checking if any client is enabled."""
self.factory.register(MockClient)
self.assertTrue(self.factory.is_client_enabled())
def test_is_client_enabled_none(self):
"""Test checking if any client is enabled when none are."""
MockClient.is_enabled = classmethod(lambda cls: False)
self.factory.register(MockClient)
self.assertFalse(self.factory.is_client_enabled())
def test_is_client_configured(self):
"""Test checking if any client is configured."""
self.factory.register(MockClient)
self.assertTrue(self.factory.is_client_configured())
def test_is_client_configured_none(self):
"""Test checking if any client is configured when none are."""
MockClient.is_configured = classmethod(lambda cls: False)
self.factory.register(MockClient)
self.assertFalse(self.factory.is_client_configured())

@ -0,0 +1,100 @@
import unittest
from unittest import mock
from aprsd.client.drivers.registry import DriverRegistry
from ..mock_client_driver import MockClientDriver
class TestDriverRegistry(unittest.TestCase):
"""Unit tests for the DriverRegistry class."""
def setUp(self):
# Reset the singleton instance before each test
DriverRegistry._singleton_instances = {}
self.registry = DriverRegistry()
self.registry.drivers = []
# Mock APRSISDriver completely
self.aprsis_patcher = mock.patch('aprsd.client.drivers.aprsis.APRSISDriver')
mock_aprsis_class = self.aprsis_patcher.start()
mock_aprsis_class.is_enabled.return_value = False
mock_aprsis_class.is_configured.return_value = False
# Mock the instance methods as well
mock_instance = mock_aprsis_class.return_value
mock_instance.is_enabled.return_value = False
mock_instance.is_configured.return_value = False
# Mock CONF to prevent password check
self.conf_patcher = mock.patch('aprsd.client.drivers.aprsis.CONF')
mock_conf = self.conf_patcher.start()
mock_conf.aprs_network.password = 'dummy'
mock_conf.aprs_network.login = 'dummy'
def tearDown(self):
# Reset the singleton instance after each test
DriverRegistry().drivers = []
self.aprsis_patcher.stop()
self.conf_patcher.stop()
def test_get_driver_with_valid_driver(self):
"""Test getting an enabled and configured driver."""
# Add an enabled and configured driver
driver = MockClientDriver
driver.is_enabled = mock.MagicMock(return_value=True)
driver.is_configured = mock.MagicMock(return_value=True)
self.registry.register(MockClientDriver)
# Get the driver
result = self.registry.get_driver()
print(result)
self.assertTrue(isinstance(result, MockClientDriver))
def test_get_driver_with_disabled_driver(self):
"""Test getting a driver when only disabled drivers exist."""
driver = MockClientDriver
driver.is_enabled = mock.MagicMock(return_value=False)
driver.is_configured = mock.MagicMock(return_value=False)
self.registry.register(driver)
with self.assertRaises(ValueError) as context:
self.registry.get_driver()
self.assertIn('No enabled driver found', str(context.exception))
def test_get_driver_with_unconfigured_driver(self):
"""Test getting a driver when only unconfigured drivers exist."""
driver = MockClientDriver
driver.is_enabled = mock.MagicMock(return_value=True)
driver.is_configured = mock.MagicMock(return_value=False)
self.registry.register(driver)
with self.assertRaises(ValueError) as context:
self.registry.get_driver()
self.assertIn('No enabled driver found', str(context.exception))
def test_get_driver_with_no_drivers(self):
"""Test getting a driver when no drivers exist."""
# Try to get a driver
with self.assertRaises(ValueError) as context:
self.registry.get_driver()
self.assertIn('No enabled driver found', str(context.exception))
def test_get_driver_with_multiple_drivers(self):
"""Test getting a driver when multiple valid drivers exist."""
# Add multiple drivers
driver1 = MockClientDriver
driver1.is_enabled = mock.MagicMock(return_value=True)
driver1.is_configured = mock.MagicMock(return_value=True)
driver2 = MockClientDriver
self.registry.register(driver1)
self.registry.register(driver2)
# Get the driver - should return the first one
result = self.registry.get_driver()
# We can only check that it's a MockClientDriver instance
self.assertTrue(isinstance(result, MockClientDriver))
if __name__ == '__main__':
unittest.main()

@ -0,0 +1,76 @@
from unittest import mock
from aprsd.packets import core
class MockClientDriver:
"""Mock implementation of ClientDriver for testing."""
def __init__(self, enabled=True, configured=True):
self.connected = False
self._alive = True
self._keepalive = None
self.filter = None
self._enabled = enabled
self._configured = configured
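# Dummy device path; only surfaced via stats(serializable=True)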
self.path = '/dev/ttyUSB0'
self.login_status = {
'success': True,
'message': None,
}
@staticmethod
def is_enabled():
"""Static method to check if driver is enabled."""
return True
@staticmethod
def is_configured():
"""Static method to check if driver is configured."""
return True
def is_alive(self):
"""Instance method to check if driver is alive."""
return self._alive
def stats(self, serializable=False):
"""Return mock stats."""
stats = {'packets_received': 0, 'packets_sent': 0}
if serializable:
stats['path'] = self.path
return stats
@property
def login_success(self):
"""Property to get login success status."""
return self.login_status['success']
@property
def login_failure(self):
"""Property to get login failure message."""
return self.login_status['message']
def decode_packet(self, *args, **kwargs):
"""Mock packet decoding."""
packet = mock.MagicMock(spec=core.Packet)
packet.raw = 'test packet'
return packet
def close(self):
self.connected = False
def setup_connection(self):
self.connected = True
def send(self, packet):
return True
def set_filter(self, filter_str):
self.filter = filter_str
@property
def keepalive(self):
return self._keepalive
def consumer(self, callback, raw=False):
pass

@ -7,9 +7,11 @@ from aprsd import ( # noqa: F401
conf,
packets,
)
from aprsd.client.drivers.registry import DriverRegistry
from aprsd.plugins import notify as notify_plugin
from .. import fake, test_plugin
from ..mock_client_driver import MockClientDriver
CONF = cfg.CONF
DEFAULT_WATCHLIST_CALLSIGNS = fake.FAKE_FROM_CALLSIGN
@ -17,9 +19,24 @@ DEFAULT_WATCHLIST_CALLSIGNS = fake.FAKE_FROM_CALLSIGN
class TestWatchListPlugin(test_plugin.TestPlugin):
def setUp(self):
super().setUp()
self.fromcall = fake.FAKE_FROM_CALLSIGN
self.ack = 1
# Mock APRSISDriver
self.aprsis_patcher = mock.patch('aprsd.client.drivers.aprsis.APRSISDriver')
self.mock_aprsis = self.aprsis_patcher.start()
self.mock_aprsis.is_enabled.return_value = False
self.mock_aprsis.is_configured.return_value = False
# Register the mock driver
DriverRegistry().register(MockClientDriver)
def tearDown(self):
super().tearDown()
if hasattr(self, 'aprsis_patcher'):
self.aprsis_patcher.stop()
def config_and_init(
self,
watchlist_enabled=True,
@ -30,7 +47,9 @@ class TestWatchListPlugin(test_plugin.TestPlugin):
):
CONF.callsign = self.fromcall
CONF.aprs_network.login = self.fromcall
CONF.aprs_fi.apiKey = "something"
CONF.aprs_fi.apiKey = 'something'
# Add mock password
CONF.aprs_network.password = '12345'
# Set the watchlist specific config options
CONF.watch_list.enabled = watchlist_enabled
@ -56,22 +75,20 @@ class TestAPRSDWatchListPluginBase(TestWatchListPlugin):
plugin = fake.FakeWatchListPlugin()
packet = fake.fake_packet(
message="version",
message='version',
msg_number=1,
)
actual = plugin.filter(packet)
expected = packets.NULL_MESSAGE
self.assertEqual(expected, actual)
@mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
def test_watchlist_not_in_watchlist(self, mock_factory):
client.client_factory = mock_factory
def test_watchlist_not_in_watchlist(self):
self.config_and_init()
plugin = fake.FakeWatchListPlugin()
packet = fake.fake_packet(
fromcall="FAKE",
message="version",
fromcall='FAKE',
message='version',
msg_number=1,
)
actual = plugin.filter(packet)
@ -85,87 +102,77 @@ class TestNotifySeenPlugin(TestWatchListPlugin):
plugin = notify_plugin.NotifySeenPlugin()
packet = fake.fake_packet(
message="version",
message='version',
msg_number=1,
)
actual = plugin.filter(packet)
expected = packets.NULL_MESSAGE
self.assertEqual(expected, actual)
@mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
def test_callsign_not_in_watchlist(self, mock_factory):
client.client_factory = mock_factory
def test_callsign_not_in_watchlist(self):
self.config_and_init(watchlist_enabled=False)
plugin = notify_plugin.NotifySeenPlugin()
packet = fake.fake_packet(
message="version",
message='version',
msg_number=1,
)
actual = plugin.filter(packet)
expected = packets.NULL_MESSAGE
self.assertEqual(expected, actual)
@mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
@mock.patch("aprsd.packets.WatchList.is_old")
def test_callsign_in_watchlist_not_old(self, mock_is_old, mock_factory):
client.client_factory = mock_factory
@mock.patch('aprsd.packets.WatchList.is_old')
def test_callsign_in_watchlist_not_old(self, mock_is_old):
mock_is_old.return_value = False
self.config_and_init(
watchlist_enabled=True,
watchlist_callsigns=["WB4BOR"],
watchlist_callsigns=['WB4BOR'],
)
plugin = notify_plugin.NotifySeenPlugin()
packet = fake.fake_packet(
fromcall="WB4BOR",
message="ping",
fromcall='WB4BOR',
message='ping',
msg_number=1,
)
actual = plugin.filter(packet)
expected = packets.NULL_MESSAGE
self.assertEqual(expected, actual)
@mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
@mock.patch("aprsd.packets.WatchList.is_old")
def test_callsign_in_watchlist_old_same_alert_callsign(
self, mock_is_old, mock_factory
):
client.client_factory = mock_factory
@mock.patch('aprsd.packets.WatchList.is_old')
def test_callsign_in_watchlist_old_same_alert_callsign(self, mock_is_old):
mock_is_old.return_value = True
self.config_and_init(
watchlist_enabled=True,
watchlist_alert_callsign="WB4BOR",
watchlist_callsigns=["WB4BOR"],
watchlist_alert_callsign='WB4BOR',
watchlist_callsigns=['WB4BOR'],
)
plugin = notify_plugin.NotifySeenPlugin()
packet = fake.fake_packet(
fromcall="WB4BOR",
message="ping",
fromcall='WB4BOR',
message='ping',
msg_number=1,
)
actual = plugin.filter(packet)
expected = packets.NULL_MESSAGE
self.assertEqual(expected, actual)
@mock.patch("aprsd.client.factory.ClientFactory", autospec=True)
@mock.patch("aprsd.packets.WatchList.is_old")
def test_callsign_in_watchlist_old_send_alert(self, mock_is_old, mock_factory):
client.client_factory = mock_factory
@mock.patch('aprsd.packets.WatchList.is_old')
def test_callsign_in_watchlist_old_send_alert(self, mock_is_old):
mock_is_old.return_value = True
notify_callsign = fake.FAKE_TO_CALLSIGN
fromcall = "WB4BOR"
fromcall = 'WB4BOR'
self.config_and_init(
watchlist_enabled=True,
watchlist_alert_callsign=notify_callsign,
watchlist_callsigns=["WB4BOR"],
watchlist_callsigns=['WB4BOR'],
)
plugin = notify_plugin.NotifySeenPlugin()
packet = fake.fake_packet(
fromcall=fromcall,
message="ping",
message='ping',
msg_number=1,
)
packet_type = packet.__class__.__name__

@ -3,6 +3,7 @@ from unittest import mock
from oslo_config import cfg
import aprsd
from aprsd.client.drivers.fake import APRSDFakeDriver
from aprsd.plugins import version as version_plugin
from .. import fake, test_plugin
@ -11,16 +12,41 @@ CONF = cfg.CONF
class TestVersionPlugin(test_plugin.TestPlugin):
@mock.patch("aprsd.stats.app.APRSDStats.uptime")
def test_version(self, mock_stats):
mock_stats.return_value = "00:00:00"
expected = f"APRSD ver:{aprsd.__version__} uptime:00:00:00"
def setUp(self):
# make sure the fake client driver is enabled
# Mock CONF for testing
super().setUp()
self.conf_patcher = mock.patch('aprsd.client.drivers.fake.CONF')
self.mock_conf = self.conf_patcher.start()
# Configure fake_client.enabled
self.mock_conf.fake_client.enabled = True
# Create an instance of the driver
self.driver = APRSDFakeDriver()
self.fromcall = fake.FAKE_FROM_CALLSIGN
def tearDown(self):
self.conf_patcher.stop()
super().tearDown()
@mock.patch('aprsd.stats.collector.Collector')
def test_version(self, mock_collector_class):
# Set up the mock collector instance
mock_collector_instance = mock_collector_class.return_value
mock_collector_instance.collect.return_value = {
'APRSDStats': {
'uptime': '00:00:00',
}
}
expected = f'APRSD ver:{aprsd.__version__} uptime:00:00:00'
CONF.callsign = fake.FAKE_TO_CALLSIGN
version = version_plugin.VersionPlugin()
version.enabled = True
packet = fake.fake_packet(
message="No",
message='No',
msg_number=1,
)
@ -28,8 +54,11 @@ class TestVersionPlugin(test_plugin.TestPlugin):
self.assertEqual(None, actual)
packet = fake.fake_packet(
message="version",
message='version',
msg_number=1,
)
actual = version.filter(packet)
self.assertEqual(expected, actual)
# Verify the mock was called exactly once
mock_collector_instance.collect.assert_called_once()

@ -9,9 +9,11 @@ from aprsd import ( # noqa: F401
plugins,
)
from aprsd import plugin as aprsd_plugin
from aprsd.client.drivers.registry import DriverRegistry
from aprsd.packets import core
from . import fake
from .mock_client_driver import MockClientDriver
CONF = cfg.CONF
@ -21,15 +23,24 @@ class TestPluginManager(unittest.TestCase):
self.fromcall = fake.FAKE_FROM_CALLSIGN
self.config_and_init()
self.mock_driver = MockClientDriver()
# Mock the DriverRegistry to return our mock driver
self.registry_patcher = mock.patch.object(
DriverRegistry, 'get_driver', return_value=self.mock_driver
)
self.mock_registry = self.registry_patcher.start()
def tearDown(self) -> None:
self.config = None
aprsd_plugin.PluginManager._instance = None
self.registry_patcher.stop()
self.mock_registry.stop()
def config_and_init(self):
CONF.callsign = self.fromcall
CONF.aprs_network.login = fake.FAKE_TO_CALLSIGN
CONF.aprs_fi.apiKey = "something"
CONF.enabled_plugins = "aprsd.plugins.ping.PingPlugin"
CONF.aprs_fi.apiKey = 'something'
CONF.enabled_plugins = 'aprsd.plugins.ping.PingPlugin'
CONF.enable_save = False
def test_get_plugins_no_plugins(self):
@ -39,7 +50,7 @@ class TestPluginManager(unittest.TestCase):
self.assertEqual([], plugin_list)
def test_get_plugins_with_plugins(self):
CONF.enabled_plugins = ["aprsd.plugins.ping.PingPlugin"]
CONF.enabled_plugins = ['aprsd.plugins.ping.PingPlugin']
pm = aprsd_plugin.PluginManager()
plugin_list = pm.get_plugins()
self.assertEqual([], plugin_list)
@ -64,7 +75,7 @@ class TestPluginManager(unittest.TestCase):
self.assertEqual(0, len(plugin_list))
def test_get_message_plugins(self):
CONF.enabled_plugins = ["aprsd.plugins.ping.PingPlugin"]
CONF.enabled_plugins = ['aprsd.plugins.ping.PingPlugin']
pm = aprsd_plugin.PluginManager()
plugin_list = pm.get_plugins()
self.assertEqual([], plugin_list)
@ -87,22 +98,31 @@ class TestPlugin(unittest.TestCase):
self.ack = 1
self.config_and_init()
self.mock_driver = MockClientDriver()
# Mock the DriverRegistry to return our mock driver
self.registry_patcher = mock.patch.object(
DriverRegistry, 'get_driver', return_value=self.mock_driver
)
self.mock_registry = self.registry_patcher.start()
def tearDown(self) -> None:
packets.WatchList._instance = None
packets.SeenList._instance = None
packets.PacketTrack._instance = None
self.config = None
self.registry_patcher.stop()
self.mock_registry.stop()
def config_and_init(self):
CONF.callsign = self.fromcall
CONF.aprs_network.login = fake.FAKE_TO_CALLSIGN
CONF.aprs_fi.apiKey = "something"
CONF.enabled_plugins = "aprsd.plugins.ping.PingPlugin"
CONF.aprs_fi.apiKey = 'something'
CONF.enabled_plugins = 'aprsd.plugins.ping.PingPlugin'
CONF.enable_save = False
class TestPluginBase(TestPlugin):
@mock.patch.object(fake.FakeBaseNoThreadsPlugin, "process")
@mock.patch.object(fake.FakeBaseNoThreadsPlugin, 'process')
def test_base_plugin_no_threads(self, mock_process):
p = fake.FakeBaseNoThreadsPlugin()
@ -110,7 +130,7 @@ class TestPluginBase(TestPlugin):
actual = p.create_threads()
self.assertEqual(expected, actual)
expected = "1.0"
expected = '1.0'
actual = p.version
self.assertEqual(expected, actual)
@ -123,7 +143,7 @@ class TestPluginBase(TestPlugin):
self.assertEqual(expected, actual)
mock_process.assert_not_called()
@mock.patch.object(fake.FakeBaseThreadsPlugin, "create_threads")
@mock.patch.object(fake.FakeBaseThreadsPlugin, 'create_threads')
def test_base_plugin_threads_created(self, mock_create):
p = fake.FakeBaseThreadsPlugin()
mock_create.assert_called_once()
@ -135,17 +155,17 @@ class TestPluginBase(TestPlugin):
self.assertTrue(isinstance(actual, fake.FakeThread))
p.stop_threads()
@mock.patch.object(fake.FakeRegexCommandPlugin, "process")
@mock.patch.object(fake.FakeRegexCommandPlugin, 'process')
def test_regex_base_not_called(self, mock_process):
CONF.callsign = fake.FAKE_TO_CALLSIGN
p = fake.FakeRegexCommandPlugin()
packet = fake.fake_packet(message="a")
packet = fake.fake_packet(message='a')
expected = None
actual = p.filter(packet)
self.assertEqual(expected, actual)
mock_process.assert_not_called()
packet = fake.fake_packet(tocall="notMe", message="f")
packet = fake.fake_packet(tocall='notMe', message='f')
expected = None
actual = p.filter(packet)
self.assertEqual(expected, actual)
@ -165,11 +185,11 @@ class TestPluginBase(TestPlugin):
self.assertEqual(expected, actual)
mock_process.assert_not_called()
@mock.patch.object(fake.FakeRegexCommandPlugin, "process")
@mock.patch.object(fake.FakeRegexCommandPlugin, 'process')
def test_regex_base_assert_called(self, mock_process):
CONF.callsign = fake.FAKE_TO_CALLSIGN
p = fake.FakeRegexCommandPlugin()
packet = fake.fake_packet(message="f")
packet = fake.fake_packet(message='f')
p.filter(packet)
mock_process.assert_called_once()
@ -177,22 +197,22 @@ class TestPluginBase(TestPlugin):
CONF.callsign = fake.FAKE_TO_CALLSIGN
p = fake.FakeRegexCommandPlugin()
packet = fake.fake_packet(message="f")
packet = fake.fake_packet(message='f')
expected = fake.FAKE_MESSAGE_TEXT
actual = p.filter(packet)
self.assertEqual(expected, actual)
packet = fake.fake_packet(message="F")
packet = fake.fake_packet(message='F')
expected = fake.FAKE_MESSAGE_TEXT
actual = p.filter(packet)
self.assertEqual(expected, actual)
packet = fake.fake_packet(message="fake")
packet = fake.fake_packet(message='fake')
expected = fake.FAKE_MESSAGE_TEXT
actual = p.filter(packet)
self.assertEqual(expected, actual)
packet = fake.fake_packet(message="FAKE")
packet = fake.fake_packet(message='FAKE')
expected = fake.FAKE_MESSAGE_TEXT
actual = p.filter(packet)
self.assertEqual(expected, actual)

@ -25,7 +25,7 @@ deps =
pytest-cov
pytest
commands =
pytest -v --cov-report term-missing --cov=aprsd {posargs}
pytest -s -v --cov-report term-missing --cov=aprsd {posargs}
coverage: coverage report -m
coverage: coverage xml