protocol: assume the initial stroke point to be 0, 0, 0

Reproduced on the Spark, the first bytes in a new file recording were
- file header: 0x62, 0x38, 0x62, 0x74
- stroke point: 0xbf, 0xff, 0xff, 0x4a, 0x14, 0x29, 0x31, 0x6a
- stroke delta: 0xa8, 0x02, 0x04, 0xc3

The initial point thus has a delta pressure value 0x6a which must be added to
*something*. And zero is the most sensible *something* that I can think of.
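As a hedged illustration of that reasoning (the `Point` namedtuple mirrors the one in the patch below, but `apply_delta` is a made-up helper, not tuhi's actual parsing code), seeding the accumulator with zeros makes the first delta resolve to a sane absolute value:

```python
from collections import namedtuple

# Mirrors the Point namedtuple in the patch; apply_delta is a
# hypothetical helper, not tuhi's real code.
Point = namedtuple('Point', ['x', 'y', 'p'])

def apply_delta(last_point, dx, dy, dp):
    """Resolve a stroke delta against the most recent absolute point."""
    return Point(last_point.x + dx, last_point.y + dy, last_point.p + dp)

# With last_point seeded to 0, 0, 0, the 0x6a pressure delta on the
# very first point yields an absolute pressure of 0x6a (106) instead
# of being added to an undefined value.
first = apply_delta(Point(0, 0, 0), 0, 0, 0x6a)
print(first.p)  # 106
```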

Fixes #111

Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
Peter Hutterer 2019-08-23 10:21:09 +10:00 committed by Benjamin Tissoires
parent c466c7d431
commit 44d2018f1c
1 changed file with 4 additions and 3 deletions


@@ -1470,7 +1470,10 @@ class StrokeFile(object):
         Stroke = namedtuple('Stroke', ['points'])
         Point = namedtuple('Point', ['x', 'y', 'p'])
-        last_point = None  # abs coords for most recent point
+        # The Spark can have a delta on the first point in a file. Let's
+        # default to 0, 0, 0 because I don't know what else could be
+        # sensible here.
+        last_point = Point(0, 0, 0)  # abs coords for most recent point
         last_delta = Point(0, 0, 0)  # delta accumulates
         strokes = []  # all strokes
@@ -1550,8 +1553,6 @@ class StrokeFile(object):
             # can process both the same way.
             if packet_type == StrokeDataType.POINT:
                 packet = StrokePoint(data)
-                if last_point is None:
-                    last_point = Point(packet.x, packet.y, packet.p)
             else:
                 packet = StrokeDelta(data)
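The second hunk can drop the `None` check only because the first hunk guarantees `last_point` is always a real `Point`. A simplified sketch of the resulting loop shape, under the assumption that both packet kinds reduce to one uniform update (`AbsPacket` and `DeltaPacket` are stand-ins here, not tuhi's real `StrokePoint`/`StrokeDelta` wire parsers):

```python
from collections import namedtuple

Point = namedtuple('Point', ['x', 'y', 'p'])

# Stand-in packet types: an absolute point carries full coordinates,
# a delta carries offsets. Neither mirrors tuhi's real byte parsing.
AbsPacket = namedtuple('AbsPacket', ['x', 'y', 'p'])
DeltaPacket = namedtuple('DeltaPacket', ['dx', 'dy', 'dp'])

def process(packets):
    # last_point starts as a valid Point, so both packet kinds go
    # through one update path with no first-packet special case.
    last_point = Point(0, 0, 0)
    points = []
    for pkt in packets:
        if isinstance(pkt, AbsPacket):
            last_point = Point(pkt.x, pkt.y, pkt.p)
        else:
            last_point = Point(last_point.x + pkt.dx,
                               last_point.y + pkt.dy,
                               last_point.p + pkt.dp)
        points.append(last_point)
    return points

# A delta arriving first still produces a usable point.
pts = process([DeltaPacket(0, 0, 0x6a), AbsPacket(100, 200, 50)])
print(pts[0].p)  # 106
```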