autopkgtests broken in hirsute - error: int32 scalar cannot be indexed with .
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| Octave | Unknown | Unknown | | |
| octave (Ubuntu) | Fix Released | Undecided | Unassigned | |
| octave-parallel (Ubuntu) | Invalid | Undecided | Unassigned | |
Bug Description
This is failing reliably on autopkgtest infra:
- initially vs 3.1.3
https:/
- Trigger octave/
https:/
- all-proposed:
https:/
- this actually fails before the new octave, i.e. with the introduction of the new octave-parallel
https:/
Containers with hirsute and hirsute-proposed work fine.
The following is the same in debian-sid, hirsute and hirsute-proposed:
root@h:
Checking package...
Checking m files ...
[inst/pararrayf
...
[parcellfun]
PASSES 1 out of 1 test
Summary: 11 tests, 11 passed, 0 known failures, 0 skipped
Local VM based autopkgtests all work.
They work for hirsute and hirsute-proposed, and for a selection of just
octave.
Even the old tests against octave-parallel 3.1.3 failed.
So it is not just "coming with 4.x" of octave-parallel.
On LP they failed as well:
https:/
Checking slightly deeper showed that the initial fail vs 3.1.3 was
NOT the same, it was a badpkg
https:/
But reruns of that appear to be the same as what we see at other times
https:/
All of the failing runs come down to:
!!!!! test failed
int32 scalar cannot be indexed with {
In comparison this seems fine on debci; there, all 4.0.0-2 runs LGTM:
https:/
https:/
Another moving piece in this puzzle is dh-octave, which was updated on
30th December 2020. There isn't an old version of that in hirsute anymore;
it migrated.
dh-octave | 0.7.6 | groovy/universe | source, all
dh-octave | 1.0.3 | hirsute/universe | source, all
But the changelog isn't too suspicious.
Not sure if it is important, but the order of the tests differs.
Each run seems to have a random order, but that is true for good and
bad runs alike.
Just to be clear on the error, it is of this type:
https:/
octave:4> data.foo
error: scalar cannot be indexed with .
octave:5> data = int32(1234)
data = 1234
octave:6> data.foo
error: int32 scalar cannot be indexed with .
But without a reproducer it is hard to tell where it might come from.
Maybe it is language/locale dependent, as local repro attempts tend to retain
some remnants of the local language settings.
I'm out of good ideas, but will continue before taking a step back and masking the test.
I've run it with debug enabled without much gain - not even when comparing that to a debug-enabled good run.
This was done in a PPA (https:/
$ xvfb-run -a octave-cli --debug --verbose --no-history --no-init-file --no-window-system inst/parcellfun.m
Further TODOs:
- try the old dh-octave?
Related branches
- Iain Lane: Approve
Diff: 116 lines (+9/-9), 9 files modified
worker-config-production/worker-bos01-arm64.conf (+1/-1)
worker-config-production/worker-bos01-ppc64el.conf (+1/-1)
worker-config-production/worker-bos01-s390x.conf (+1/-1)
worker-config-production/worker-bos01.conf (+1/-1)
worker-config-production/worker-bos02-arm64.conf (+1/-1)
worker-config-production/worker-bos02-ppc64el.conf (+1/-1)
worker-config-production/worker-bos02-s390x.conf (+1/-1)
worker-config-production/worker-canonistack.conf (+1/-1)
worker-config-production/worker.conf (+1/-1)
tags: added: update-excuse
All proposed + PPA:
Locally: autopkgtest/autopkgtest/runner/autopkgtest --no-built-binaries --apt-upgrade --apt-pocket=proposed --setup-commands="add-apt-repository ppa:paelzer/lp-1911400-octave-test-fail; apt update; apt -y upgrade" --shell octave-parallel_4.0.0-2ubuntu1~ppa1.dsc -- qemu --ram-size=1536 --cpus 2 ~/work/autopkgtest-hirsute-amd64.img
$ sudo ~/work/
=> this kept working
PPA:
$ lp-test-ppa ppa:paelzer/lp-1911400-octave-test-fails --release hirsute --showskip --showpass
=> This kept failing
I've isolated the logs of one of the tests (inst/pararrayfun.m) and diffed them.
Unfortunately not much insight: after a whopping 15013 lines that match fully, the bad case runs directly into the failure.
The issue is very close to the end; in the good case just 20 more lines follow before the logs fully match again (apart from the reports stating that one test failed).
The asserts of the test are far away - it seems this happens "on the way out" of the test and is not directly associated with the assert.
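The log comparison described above can be sketched with a small helper. This is a hypothetical illustration of the diffing approach, not part of the actual tooling used; the sample log lines are placeholders:

```python
def first_divergence(good_lines, bad_lines):
    """Return (index, good_line, bad_line) for the first differing line,
    or None if the shorter log is a prefix of the other (or both match)."""
    for i, (g, b) in enumerate(zip(good_lines, bad_lines)):
        if g != b:
            return i, g, b
    return None

# Placeholder excerpts standing in for the good and bad autopkgtest logs.
good = ["PASSES 1 out of 1 test", "Summary: 11 tests, 11 passed"]
bad = ["PASSES 1 out of 1 test", "!!!!! test failed"]

result = first_divergence(good, bad)
print(result)  # (1, 'Summary: 11 tests, 11 passed', '!!!!! test failed')
```

In the real comparison the divergence only showed up after 15013 identical lines, which is what suggests the failure happens at test teardown rather than in the test body.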