
qfieldcloud's Introduction

QFieldCloud

QFieldCloud is a Django-based service designed to synchronize projects and data between QGIS (+ QFieldSync plugin) and QField.

QFieldCloud enables seamless synchronization of your field data with your spatial infrastructure, with change tracking, team management, and online/offline work capabilities in QField.

Hosted solution

If you're interested in quickly getting up and running, we suggest subscribing to the version hosted by OPENGIS.ch at https://qfield.cloud. This is also the instance that is integrated by default into QField.

Documentation

QField and QFieldCloud documentation is available at https://docs.qfield.org/.

Development

Clone the repository

Clone the repository and all its submodules:

git clone --recurse-submodules git@github.com:opengisch/qfieldcloud.git

To fetch upstream development, don't forget to update the submodules too:

git pull --recurse-submodules  && git submodule update --recursive

Launch a local instance

Copy .env.example to .env and configure it to your liking with your preferred editor:

cp .env.example .env
emacs .env

Make sure the host's firewall allows port 8009, which is required by the minio service; otherwise the service is likely to fail to start.
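
For example, on a host that uses ufw, the port could be opened like this (a minimal sketch; the exact rule depends on your firewall and network setup):

sudo ufw allow 8009/tcp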

To build development images and run the containers:

docker compose up -d --build

It will read the docker-compose*.yml files specified in the COMPOSE_FILE variable and start a Django development server at http://localhost:8011.
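
For reference, the COMPOSE_FILE related values for a local development setup typically look like this in .env (variable names and values as found in .env.example; adjust to your setup):

COMPOSE_FILE=docker-compose.yml:docker-compose.override.local.yml
# required for making COMPOSE_FILE above cross-platform (do not change)
COMPOSE_PATH_SEPARATOR=:
# The Django development port. Not used in production.
DJANGO_DEV_PORT=8011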

Run the Django database migrations:

docker compose exec app python manage.py migrate

And collect the static files (CSS, JS etc):

docker compose run app python manage.py collectstatic --noinput

You can check if everything seems to work correctly using the status command:

docker compose exec app python manage.py status

Now you can get started by adding the first user, which will also be a superuser:

docker compose run app python manage.py createsuperuser --username super_user --email [email protected]

Tests

To run all the unit and functional tests (on a throwaway test database and a throwaway test storage directory):

export COMPOSE_FILE=docker-compose.yml:docker-compose.override.local.yml:docker-compose.override.test.yml
# (Re-)build the app service to install necessary test utilities (requirements_test.txt)
docker compose up -d --build
docker compose run app python manage.py migrate
docker compose run app python manage.py test --keepdb

To run only a single test module (e.g. test_permission.py):

docker compose run app python manage.py test --keepdb qfieldcloud.core.tests.test_permission

Debugging

This section gives examples for VSCode; please adapt them to your IDE.

If you are using the provided docker-compose.override.local.yml, then debugpy is automatically installed and configured for use.

Add the following to your IDE to connect (example given for VSCode's .vscode/launch.json, triggered with F5):

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "QFC debug app",
            "type": "python",
            "request": "attach",
            "justMyCode": false,
            "connect": {"host": "localhost", "port": 5678},
            "pathMappings": [
                {
                    "localRoot": "${workspaceFolder}/docker-app/qfieldcloud",
                    "remoteRoot": "/usr/src/app/qfieldcloud"
                },
                {
                    "localRoot": "${workspaceFolder}/docker-app/site-packages",
                    "remoteRoot": "/usr/local/lib/python3.10/site-packages/"
                },
            ],
        },
        {
            "name": "QFC debug worker_wrapper",
            "type": "python",
            "request": "attach",
            "justMyCode": false,
            "connect": {"host": "localhost", "port": 5679},
            "pathMappings": [
                {
                    "localRoot": "${workspaceFolder}/docker-app/qfieldcloud",
                    "remoteRoot": "/usr/src/app/qfieldcloud"
                },
                {
                    "localRoot": "${workspaceFolder}/docker-app/site-packages",
                    "remoteRoot": "/usr/local/lib/python3.10/site-packages/"
                },
            ],
        }
    ]
}

To add breakpoints in vendor modules installed via pip or apt, you need a copy of their source code. The easiest way to achieve that is to make an actual copy of them:

docker compose cp app:/usr/local/lib/python3.10/site-packages/ docker-app/site-packages

The configuration for the vendor modules is the second object in the example pathMappings above, combined with setting justMyCode to false.

Do not forget to copy the site packages every time any of the requirements.txt files are changed!

If you are not using docker-compose.override.local.yml, there are other options.

You can debug interactively by adding this snippet anywhere in the code:

import debugpy
debugpy.listen(("0.0.0.0", 5680))
print("debugpy waiting for debugger... ๐Ÿ›")
debugpy.wait_for_client()  # optional

Alternatively, prefix your commands with python -m debugpy --listen 0.0.0.0:5680 --wait-for-client.

docker compose run -p 5680:5680 app python -m debugpy --listen 0.0.0.0:5680 --wait-for-client manage.py test
docker compose run -p 5681:5681 worker_wrapper python -m debugpy --listen 0.0.0.0:5681 --wait-for-client manage.py test

Note that if you run tests using the docker-compose.override.test.yml configuration, the app and worker_wrapper containers expose ports 5680 and 5681 respectively.

Add root certificate

QFieldCloud will automatically generate a certificate and its root certificate in ./conf/nginx/certs. However, you need to trust the root certificate first, so other programs (e.g. curl) can create secure connections to the local QFieldCloud instance.

On Debian/Ubuntu, copy the root certificate to the directory with trusted certificates. Note the extension has been changed to .crt:

sudo cp ./conf/nginx/certs/rootCA.pem /usr/local/share/ca-certificates/rootCA.crt

Trust the newly added certificate:

sudo update-ca-certificates

Connecting with curl should return no errors:

curl https://localhost:8002/

Remove the root certificate

If you want to remove or change the root certificate, you need to remove the root certificate file and refresh the list of certificates:

sudo rm /usr/local/share/ca-certificates/rootCA.crt
sudo update-ca-certificates --fresh

Now connecting with curl should fail with a similar error:

$ curl https://localhost:8002/

curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html

curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.

Code style

Code style is enforced with pre-commit:

pip install pre-commit
# install pre-commit hook
pre-commit install
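
To run all hooks against the whole code base at once (for example before opening a PR), you can also invoke pre-commit manually:

pre-commit run --all-files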

Deployment

Launch a server instance

Copy .env.example to .env and configure it to your liking with your preferred editor:

cp .env.example .env
emacs .env

Do not forget to set DEBUG=0 and to adapt COMPOSE_FILE so that it does not load the local development configurations.
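
A minimal sketch of the relevant .env changes for production (only the base compose file is listed here; include whatever production override files your deployment actually uses):

DEBUG=0
COMPOSE_FILE=docker-compose.yml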

Create the directory for the qfieldcloud logs and the supervisor socket file:

mkdir /var/local/qfieldcloud

Build and run the docker containers:

docker compose up -d --build

Run the Django database migrations:

docker compose exec app python manage.py migrate

Create or renew a certificate using Let's Encrypt

If your server has a public domain, you can install a Let's Encrypt certificate by running the following command:

./scripts/init_letsencrypt.sh

The same command can also be used to update an expired certificate.

Note you may want to change the LETSENCRYPT_EMAIL, LETSENCRYPT_RSA_KEY_SIZE and LETSENCRYPT_STAGING variables.
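
For example, the following values could be set in .env before running the script (the email address below is a placeholder; LETSENCRYPT_STAGING=1 avoids hitting request limits while testing your setup):

LETSENCRYPT_EMAIL=admin@example.com
LETSENCRYPT_RSA_KEY_SIZE=4096
LETSENCRYPT_STAGING=1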

Infrastructure

Based on this example https://testdriven.io/blog/dockerizing-django-with-postgres-gunicorn-and-nginx/

Ports

service         port    configuration        local  development  production
nginx http      80      WEB_HTTP_PORT        ✅     ✅           ✅
nginx https     443     WEB_HTTPS_PORT       ✅     ✅           ✅
django http     8011    DJANGO_DEV_PORT      ✅     ❌           ❌
postgres        5433    HOST_POSTGRES_PORT   ✅     ✅           ✅
memcached       11211   MEMCACHED_PORT       ✅     ❌           ❌
geodb           5432    HOST_POSTGRES_PORT   ✅     ✅           ❌
minio API       8009    MINIO_API_PORT       ✅     ❌           ❌
minio browser   8010    MINIO_BROWSER_PORT   ✅     ❌           ❌
smtp web        8012    SMTP4DEV_WEB_PORT    ✅     ❌           ❌
smtp            25      SMTP4DEV_SMTP_PORT   ✅     ❌           ❌
imap            143     SMTP4DEV_IMAP_PORT   ✅     ❌           ❌

Logs

Docker logs are managed by docker in the default way. To read the logs:

docker compose logs
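
To follow the logs of a single service, e.g. the app container:

docker compose logs -f app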

For nicely formatted nginx logs, use:

QFC_JQ='[.ts, .ip, (.method + " " + (.status|tostring) + " " + (.resp_time|tostring) + "s"), .uri, "I " + (.request_length|tostring) + " O " + (.resp_body_size|tostring), "C " + (.upstream_connect_time|tostring) + "s", "H " + (.upstream_header_time|tostring) + "s", "R " + (.upstream_response_time|tostring) + "s", .user_agent] | @tsv'
docker compose logs nginx -f --no-log-prefix | grep ':"nginx"' | jq -r $QFC_JQ

Geodb

The geodb (the database for the users' project data) is installed on separate machines (db1.qfield.cloud, db2.qfield.cloud, db3, …); they are load balanced and available through the db.qfield.cloud address.

There is a template database called template_postgis that is used to create the databases for the users (see the sketch after this list). The template db has the following extensions installed:

  • fuzzystrmatch
  • plpgsql
  • postgis
  • postgis_tiger_geocoder
  • postgis_topology
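
To illustrate how the template is used, a new per-user database can be created from it roughly as follows (a hedged sketch: the database name and connection parameters are placeholders, not the actual names QFieldCloud uses):

psql -h db.qfield.cloud -U postgres -c 'CREATE DATABASE some_user_geodb TEMPLATE template_postgis;'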

Storage

You can use either the integrated minio object storage, or an external provider (e.g. S3) with versioning enabled. Check the corresponding STORAGE_* environment variables for more info.
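
As an illustration, a minio-backed local setup uses roughly the following variables (names as found in .env.example; the values here are placeholders):

STORAGE_ACCESS_KEY_ID=minioadmin
STORAGE_SECRET_ACCESS_KEY=change-me-please
STORAGE_BUCKET_NAME=qfieldcloud-local
STORAGE_REGION_NAME=
# Must be reachable both from within docker and from the host
STORAGE_ENDPOINT_URL=http://172.17.0.1:8009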

Collaboration

Contributions welcome!

Any PR, including [WIP] ones, should:

  • be able to be checked out without breaking the stack;
  • keep the specific feature being developed/modified testable locally (which does not mean it has to work correctly yet).

qfieldcloud's People

Contributors

arkanoid87, dependabot[bot], faebebin, gustry, itsakifa, m-kuhn, marioba, mbernasocchi, mbi, miili, nirvn, olivierdalang, philipp-baumann, robert197, signedav, stcz, suricactus, why-not-try-calmer


qfieldcloud's Issues

Error pushing changes

I'm getting these errors when trying to push changes from QField.

Failed to upload delta file, reason:
Error transferring
https://app.qfield.cloud/api/v1/deltas/1864d519-8fd9-425c-abc5-4ab97c9fd617/

  • server replied: Internal Server Error
    [QF/]

[QF/]
[HTTP/500]
https://app.qfield.cloud/api/v1/deltas/1864d519-8fd9-425c-abc5-4ab97c9fd617/
Server Error.{"code":["unknown_error"],"message":["QFieldcloud Unknown
Error"]}
{"code":["unknown_error"],"message":["QFieldcloud Unknown Error"]}
Error transferring
https://app.qfield.cloud/api/v1/deltas/1864d519-8fd9-425c-abc5-4ab97c9fd617/

  • server replied: Internal Server Error

Adding members to teams is failing

Adding a member to a team fails. The user I try to add as a team member is not only a member of the organization but also the owner of the organization.

cannot upload local QfieldSync directory

I have a bug when trying to convert a project into a cloud project.

When I want to convert the currently open project to a cloud project, if I use the button with the three dots and then select a folder on my computer, I get a message saying "The entered path does not contain a QGIS project file yet". Then I click "OK" and it sends me back to the folder selection (see image below).

cannot upload local QfieldSync directory

However, if I write the folder path by hand instead of using the three-dots button, the "create" button becomes enabled.

manual entry of the directory

Unable to synchronize, changes stay in "pending" status

A few days ago, the synchronization of our two QField cloud projects stopped working. The projects are owned by one account and shared to three others, which are each logged in on one Android device. App versions are 2.0.9 on two and 2.0.14 on the third device.

The failure to synchronize makes itself evident by the synchronization running endlessly on the Android devices when there are local changes (hangs at the stage "QFieldCloud is applying the latest uploaded changes. This might take some time, please hold tight."). Alternatively, on QField 2.0.14, the synchronization sporadically also aborted with a message "Data failed to reach QField cloud" (without further information).

Pushing the changes instead of synchronizing the project results in a success message. Subsequent synchronization attempts after the push also terminate without error. However, rather than updating the data set, the synchronization in this case results in the previously pushed local changes to go missing and thus to data loss.

On QFieldCloud's web interface, the changes we attempted to synchronize do appear in the "Changes" view with status "Pending". Trying to apply them manually yields the message "x change(s) will be applied". However, even after several hours, they are still not applied. The corresponding Jobs are failing without any output (#313).

I hope this description provides a suitable starting point for an analysis of the problem. Of course, we are happy to provide additional details and the IDs of/access to the projects in question, in case this is helpful.

Cancel user invitation

I invited one colleague to join Qfieldcloud but he is not interested (unbelievable but true!).

Is it possible to cancel the invitation request to give the chance to someone else?
I did not find any related button for this action.

Support basemap creation on QFieldCloud

Describe the issue

Even though I check the box to create a basemap for my project, it is not created.
The project transfers without issue to the cloud and to the device, but no basemap anywhere. Am I doing something wrong?

Reproduction steps

Steps to reproduce the behavior:

  1. Go to QFieldSync plugin preferences
  2. Check Basemap option and fill parameters
  3. Export project to the cloud and sync it to a device

Expected behavior

A basemap should be created.

Observed behavior

No basemap anywhere...

Screenshots and GIFs

NA

[Please also attach additional files if a specific project/dataset is useful to investigate the problem.]

Desktop (please complete the following information)

  • OS: Windows
  • QGIS Version 3.22.8
  • QFieldSync Version 4.1.1

Mobile (please complete the following information)

  • Device: Samsung Galaxy A52
  • OS: Android 12
  • QField version: 2.2.2 (81e273)

Additional information

  • Problem started happening recently, didn't happen in an older version of QField: I'm not sure, only started using QFieldCloud more intensively
  • Problem can be reliably reproduced, doesn't happen randomly: Yes

[If the problem happened with QFieldCloud, please add your username and project name.]
username: OBVYamaska
Project: VTT_Benev

Duplicate Deltas

Hi, we're seeing a duplicate_deltafile server error when trying to synchronise or push commits from QField, which is preventing projects from syncing and uploading new data to QFieldCloud.

This behaviour is happening across multiple devices (but not all devices pushing data to the project on QFieldCloud).

On QFieldCloud the project has all conflicts resolved (i.e. status is project synced).

Is there a way to handle this from within QField or to regenerate a deltafile id on the device? Or, is there a way to remove the deltafile with the clashing id from QFieldCloud to allow data on devices to sync?

Relatedly, is there a better option than adb backup -noapk xxxx for retrieving data from QFieldCloud projects that are prevented from syncing as a back up in cases like this?

Please let me know if I can provide any more info or if this should be moved to Discussions. Thanks!

Screenshot_2021-11-23-19-35-23-64_2e601d870d31f598a7096b1e53961eb8~2

Run the orchestrator container as non-root

As discussed in #19, the orchestrator container currently runs as root to be able to talk to the docker daemon.

It would probably be worth having a less privileged user just for that purpose.

Server error while applying pending changes

Hello,
I pushed about 100 modifications from QField, and when I go to my project through the browser, their status is "error". When I try to apply the changes, no matter how many changes I try to apply at a time, after pressing "Yes, apply pending changes" I get a Server Error (500). How could I get back the work I did?
Alternatively, is it possible to access the changes on my phone after I pushed them?
Thanks a lot!

Issue with downloading project to device

When I try to download a QFieldCloud project I get the following errors:

  1. Not packaged correctly
  2. Invalid data provider

The project is fine as a cable-downloaded project using QFieldSync.

Data is in 28355
Screenshot_2022-03-12-22-11-33-356


Log from admin console

<<<::12e3312f-1de9-4e80-b54a-3ed9bcca8ef4 Download Project Directory
INFO:root:Downloading file "projects/473e9e36-21b0-4085-8954-51015f680343/files/Base_Data.gpkg", size: 225402880 bytes, md5sum: "57d62855a93e679778c0e68736a0d380-27" 
INFO:root:Downloading file "projects/473e9e36-21b0-4085-8954-51015f680343/files/Community_Mapping_qfield.qgs", size: 1126958 bytes, md5sum: "51b4085209b459bd246295214c3506ee" 
INFO:root:Downloading file "projects/473e9e36-21b0-4085-8954-51015f680343/files/Conservation_Species_Vernacular.csv", size: 7986 bytes, md5sum: "bc03b28c398fc98805114dab5882b9c1" 
INFO:root:Downloading file "projects/473e9e36-21b0-4085-8954-51015f680343/files/Team.csv", size: 83 bytes, md5sum: "10c05c35e7e5b110fcb582cd19f57477" 
::>>>::12e3312f-1de9-4e80-b54a-3ed9bcca8ef4 2
::<<<::c260485e-ac08-4a66-8e5c-4b4d4d8eaa50 Project Validity Check
INFO:root:Check QGIS project file validity...
::>>>::c260485e-ac08-4a66-8e5c-4b4d4d8eaa50 2
::<<<::f5b1a6ab-44d8-4373-bab8-ed6b4a9a197d Opening Check
INFO:root:Open QGIS project file...
INFO:QGIS_STDERR:Starting QGIS app version 32203 (exported)...
WARNING:QGIS_STDERR:QStandardPaths: XDG_RUNTIME_DIR points to non-existing path '/run/user/0', please create it with 0700 permissions.
WARNING:QGIS_MSGLOG:Cannot open Z:/My Drive/Mangoesmapping/Spatial Projects/2022/002_TRC_Community_Mapping/Working/QField/Base_Data.gpkg.()
WARNING:QGIS_MSGLOG:Cannot open Z:/My Drive

WARNING:QGIS_MSGLOG: * Z:/My Drive/Mangoesmapping/Spatial Projects/2022/002_TRC_Community_Mapping/Working/Data_Editable.gpkg|layername=Record_a_place
WARNING:QGIS_MSGLOG: * Z:/My Drive/Mangoesmapping/Spatial Projects/2022/002_TRC_Community_Mapping/Working/Data_Editable.gpkg|layername=Record_an_area
WARNING:QGIS_MSGLOG: * Z:/My Drive/Mangoesmapping/Spatial Projects/2022/002_TRC_Community_Mapping/Working/Team.csv
WARNING:QGIS_MSGLOG: * Z:/My Drive/Mangoesmapping/Spatial Projects/2022/002_TRC_Community_Mapping/Working/QField/Base_Data.gpkg|layername=Watercourse
WARNING:QGIS_MSGLOG: * Z:/My Drive/Mangoesmapping/Spatial Projects/2022/002_TRC_Community_Mapping/Working/Base_Data.gpkg|layername=Roads and Tracks
::>>>::3d3d347f-a6bf-464f-b5df-0b3185c0a108 2
INFO:QGIS_STDERR:Stopping QGIS app...
Feedback pre:
{

minio unhealthy - local docker setup

Hi and thanks for this great project!

I want to test it on a local instance and I can't make it work, minio container is unhealthy.
I checked the logs and here are the results:

qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: unknown minor metadata version (*errors.errorString)
qfieldcloud-minio-1  |        4: cmd/xl-storage.go:1971:cmd.(*xlStorage).RenameData()
qfieldcloud-minio-1  |        3: cmd/xl-storage-disk-id-check.go:420:cmd.(*xlStorageDiskIDCheck).RenameData()
qfieldcloud-minio-1  |        2: cmd/erasure-object.go:530:cmd.renameData.func1()
qfieldcloud-minio-1  |        1: internal/sync/errgroup/errgroup.go:123:errgroup.(*Group).Go.func1()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: Write failed. Insufficient number of disks online (*errors.errorString)
qfieldcloud-minio-1  |        5: cmd/erasure-object.go:827:cmd.erasureObjects.putObject()
qfieldcloud-minio-1  |        4: cmd/erasure-object.go:590:cmd.erasureObjects.PutObject()
qfieldcloud-minio-1  |        3: cmd/erasure-sets.go:893:cmd.(*erasureSets).PutObject()
qfieldcloud-minio-1  |        2: cmd/erasure-server-pool.go:774:cmd.(*erasureServerPools).PutObject()
qfieldcloud-minio-1  |        1: cmd/data-scanner.go:163:cmd.runDataScanner()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: Write failed. Insufficient number of disks online (*errors.errorString)
qfieldcloud-minio-1  |        5: cmd/erasure-object.go:827:cmd.erasureObjects.putObject()
qfieldcloud-minio-1  |        4: cmd/erasure-object.go:590:cmd.erasureObjects.PutObject()
qfieldcloud-minio-1  |        3: cmd/erasure-sets.go:893:cmd.(*erasureSets).PutObject()
qfieldcloud-minio-1  |        2: cmd/erasure-server-pool.go:774:cmd.(*erasureServerPools).PutObject()
qfieldcloud-minio-1  |        1: cmd/data-usage.go:56:cmd.storeDataUsageInBackend()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: Storage resources are insufficient for the write operation .minio.sys/buckets/.bloomcycle.bin (cmd.InsufficientWriteQuorum)
qfieldcloud-minio-1  |        1: cmd/data-scanner.go:165:cmd.runDataScanner()
qfieldcloud-minio-1  |
qfieldcloud-minio-1  | API: SYSTEM()
qfieldcloud-minio-1  | Time: 10:00:18 UTC 09/21/2022
qfieldcloud-minio-1  | DeploymentID: ad081a54-4d80-4428-b884-1cf0d5ae34e8
qfieldcloud-minio-1  | Error: Storage resources are insufficient for the write operation .minio.sys/buckets/.usage.json (cmd.InsufficientWriteQuorum)
qfieldcloud-minio-1  |        1: cmd/data-usage.go:58:cmd.storeDataUsageInBackend()

As it seems it might be a lack of storage issue, I checked the available space with df -h and I have free space left : /dev/nvme0n1p5 203G 85G 108G 44% /home (108G available)

I've followed the readme and populated the .env file as follows:

DEBUG=1
ENVIRONMENT=test #also tested with "local"

QFIELDCLOUD_HOST=localhost
DJANGO_SETTINGS_MODULE=qfieldcloud.settings
DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 0.0.0.0 app 192.168.0.77

SECRET_KEY=mkS3KUAa7h!Dd2r7TG&wQAa7h!G&wQEmkS

LETSENCRYPT_EMAIL=[email protected]
LETSENCRYPT_RSA_KEY_SIZE=4096
# Set to 1 if you're testing your setup to avoid hitting request limits
LETSENCRYPT_STAGING=1

STORAGE_ACCESS_KEY_ID=minioadmin
STORAGE_SECRET_ACCESS_KEY=!G&wQEmkS3KUAa7h!Dd2r7TG&wQAa7h!G&wQEmkS
STORAGE_BUCKET_NAME=qfieldcloud-local
STORAGE_REGION_NAME=

# URL to the storage endpoint either minio, or external (e.g. S3).
# The URL must be reachable both from within docker and from the host, the default value is the `bridge` docker URL.
# Read more on https://docs.docker.com/network/network-tutorial-standalone/ .
# NOTE: to use minio on windows/mac, change the value to "http://host.docker.internal:8009"
# DEFAULT: http://172.17.0.1:8009
STORAGE_ENDPOINT_URL=http://172.17.0.1:8009

# Public port to the minio API endpoint. It must match the configured port in `STORAGE_ENDPOINT_URL`.
# NOTE: active only when minio is the configured as storage endpoint. Mostly for local development.
# DEFAULT: 8009
MINIO_API_PORT=8009

# Public port to the minio browser endpoint.
# NOTE: active only when minio is the configured as storage endpoint. Mostly for local development.
# DEFAULT: 8010
MINIO_BROWSER_PORT=8010

WEB_HTTP_PORT=8008
WEB_HTTPS_PORT=4443

POSTGRES_USER=qfieldcloud_db_admin
POSTGRES_PASSWORD=3shJDd2r7Twwkehb
POSTGRES_DB=qfieldcloud_db
POSTGRES_HOST=db
POSTGRES_PORT=5432
# "prefer" OR "require" most of the times
POSTGRES_SSLMODE=prefer
HOST_POSTGRES_PORT=5433

GEODB_HOST=geodb
GEODB_PORT=5432
GEODB_USER=postgres
GEODB_PASSWORD=KUAa7h!G&wQEmkS3
GEODB_DB=postgres

SENTRY_DSN=

REDIS_PASSWORD=ffoB#4^rrP4kvpo74y$!Y8ZdHvPLQNta3wZyj8MU
REDIS_PORT=6379

LOG_DIRECTORY=/tmp
TMP_DIRECTORY=/tmp

ACCOUNT_EMAIL_VERIFICATION=optional
EMAIL_HOST=smtp4dev
EMAIL_USE_TLS=False
EMAIL_USE_SSL=False
EMAIL_PORT=25
EMAIL_HOST_USER=user
EMAIL_HOST_PASSWORD=password
DEFAULT_FROM_EMAIL=webmaster@localhost

QFIELDCLOUD_DEFAULT_NETWORK=qfieldcloud_default

# Admin URI. Requires slash in the end. Please use something that is hard to guess.
QFIELDCLOUD_ADMIN_URI=niawdmin/
# password for niaw_admin : DkAggvQMUnBMJ7c2wKeeQnKCRVeu9^6$UoM%TMS*37vb#Jm2s

# QFieldCloud URL used within the worker as configuration for qfieldcloud-sdk
QFIELDCLOUD_WORKER_QFIELDCLOUD_URL=http://app:8000/api/v1/

# The Django development port. Not used in production.
# DEFAULT: 8011
DJANGO_DEV_PORT=8011

GUNICORN_TIMEOUT_S=300
GUNICORN_MAX_REQUESTS=300
GUNICORN_WORKERS=3
GUNICORN_THREADS=3

# Not used in production.
# DEFAULT: 8012
SMTP4DEV_WEB_PORT=8012

# Not used in production.
# DEFAULT: 25
SMTP4DEV_SMTP_PORT=25

# Not used in production.
# DEFAULT: 143
SMTP4DEV_IMAP_PORT=143

COMPOSE_PROJECT_NAME=qfieldcloud
COMPOSE_FILE=docker-compose.yml:docker-compose.override.local.yml
# required for making COMPOSE_FILE above cross-platform (do not change)
COMPOSE_PATH_SEPARATOR=:

I changed the docker-compose command to fit my setup (docker compose, not docker-compose): docker compose up -d --build (without -)
I'm running docker on an Ubuntu server 22.04.1.
uname -a
Linux nuc 5.15.0-46-generic #49-Ubuntu SMP Thu Aug 4 18:03:25 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Here are my docker information:

Docker Compose version v2.6.0

Client: Docker Engine - Community
 Version:           20.10.17
 API version:       1.41
 Go version:        go1.17.11
 Git commit:        100c701
 Built:             Mon Jun  6 23:02:46 2022
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

Server: Docker Engine - Community
 Engine:
  Version:          20.10.17
  API version:      1.41 (minimum version 1.12)
  Go version:       go1.17.11
  Git commit:       a89b842
  Built:            Mon Jun  6 23:00:51 2022
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.6.7
  GitCommit:        0197261a30bf81f1ee8e6a4dd2dea0ef95d67ccb
 runc:
  Version:          1.1.3
  GitCommit:        v1.1.3-0-g6724737
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0

Do you know what is going on? Can I test things out to get it working?

PostGIS layer and user name in pg_service secret

I have a project with PostGIS layer (db pg_hba.conf configured for a given user and database) and pg_service secret file where I set db host, db name, user name (not root of course) and passw.

Upon sync to qField I am getting connection error (found on qField cloud job details):
FATAL: no pg_hba.conf entry for host "185.203.114.168", user "root", database "mydb_test", SSL off
WARNING:QGIS_MSGLOG:1 unavailable layer(s) found:

I tried to double check & rewrite pg_service secret, but getting the same - "... user "root", ..."

So - what could be wrong?

brgds

Jobs: Timeout error - 60 s too short - Changes are not applied

Jobs on Eifel-Large-N/Eifel_LargeN-Deployment are running into a timeout error.

It looks like the deltas are too many.

Timeout error! The job failed to finish within 60 seconds!

See job e.g. 50f8b8b2

60 seconds timeout is too short for the running job to finish.

Virtual layers with QField Cloud

### Short description of problem

I am trying to configure a QGIS project with virtual layers but it does not synchronize correctly to QField Cloud.

### Reproduction Steps:

I am working on a project for fieldwork in which a point feature (portal addresses) has N related records (households) in a related table, so that when the user clicks on a portal address point, the N households show up.

image

Both features, with/without geometry, are loaded to Qfield Cloud as Geopackage files.

I have also created a virtual layer, based on a SQL query*, in order to symbolize the points in green (once all of its related households have been inspected by the fieldworker).

SQL query:

image

### Expected behavior:

I have tried the same workflow (with a simple case) and it works fine in the cloud and the app. The points update to green when I collect data for 100% of their related household records.

### Observed behavior:

In the project I want to implement, I think the issue might be in the query I mentioned before. In QGIS desktop it takes quite a long time to load on every zoom in/out refresh. And the project, when I upload it to QField Cloud, stays in red color and cannot be downloaded to the app.

image

So, I have checked that virtual layers work with Qfield Cloud. Any ideas on how to improve the SQL query to achieve the same goal?

Thank you!

## Additional information:

  • Problem started happening recently, didn't happen in an older version of QField: [No]
  • Problem can be reliably reproduced, doesn't happen randomly: [Yes]
  • Problem happens with all files and projects, not only some files or projects: [No]

Network request timed out - Error / maximum Size of .GPKG file ?

Hi,
Until now I could easily sync or upload our project (about 12 MB in size) to QFieldCloud.

Today I added a giant layer with over 180,000 lines (we get this private-property-border layer from our local government in Austria).
Adding this big layer bloats the .gpkg to 145 MB and the following error occurs while syncing.

Unbenannt2

This results in QGIS freezing at 40%... that's why I can't actually read the protocol, because I have to force-restart the crashed QGIS.

Unbenannt

Is there a known maximum size of .gpkg for cloud use right now?

Because if I export the same project via cable to our tablet, the project works surprisingly fine in QField, even with the giant layer.
So the problem must be some kind of upload limit to the cloud, rather than a general file-size regulation in QField.

greets and Thanks

Gernot

No Information about failed Jobs on Webpage

I have some failed jobs in my qfield cloud under the jobs tab.
Jobs

On succeeded jobs I'm able to click on the Job ID to receive more details.

On failed jobs I'm not able to click on the Job ID, and so I'm not able to receive details what went wrong.

To investigate the cause, it would be great to receive more details about the failures.

Thanks for your work.

Collaborators and Teams: UI Obscured

Hi all,

the experience of adding users and teams could be improved by better auto-suggestions. Also, the input is obscured by Firefox's auto-completion "View saved Logins". The <input ... name=username .../> is the problem.

image

Synchronizing from local to cloud fails on windows computer but works on Linux mint

Strange issue, synchronizing (from local to cloud) from a windows 10 Enterprise pc is a problem since yesterday evening. But it works from cloud to local.

Synchronizing from a linux Mint 20 xfce 64-bit works in both ways.
I thought perhaps it was an issue because of some geometry changes I made on the Windows side, so I synchronized back from Linux Mint to the cloud and then, on the Windows pc, from cloud to local (I lost the geometry changes). This worked out well. Then, after making one change on the Windows side again and syncing back from local to cloud, syncing was aborted again.

Each time I got a time-out message, with no difference whether I tried to sync all changes (*.qgs and *.gpkg files) at once or one by one.

image
image
image
image

Changes not applied

Hi everyone,

I do not understand why some changes of my field workers are not applied. They push changes once a day and I have seen on the website that some of those changes have the status "Not applied".
image
If I try to change the status to "Applied" manually, it does not work either. The action starts but, after a while, it says "Not applied" again (Server error 500).

Why might this happen?

I attach the raw json of one of those not applied changes.
raw_json.txt

Thanks,

Gabriel

Postgis in qfieldcloud

How do I use PostGIS in QFieldCloud? Is there a tutorial?
It is announced but is it functional?

Project changed into QField cloud project when exporting on the cloud

Hello,

Would it be possible to avoid QField Cloud changing the project into a QField Cloud project when exporting the project to the cloud?

If the current project hasn't been saved, all the changes are lost.
If changes are made after the export, nothing can be saved on the current project since the current project isn't displayed anymore (it is saved somewhere else). This caused me to lose hours of work while trying to make QField Cloud work on my projects.

Before export:
Sans titre2

After export:
Sans titre

Any change on the project after this export is easily lost.

Documentation doesn't succeed

Thanks for the great project. I want to take a closer look at it and followed the instructions on the README.

I created a new Ubuntu 20.04 VM, installed docker-compose, checked out the master branch of this repository and followed the instructions.

The first error I receive is:

invalid sslmode value: "prefer # "prefer" OR "require" most of the times"

I solved this by removing the comment in .env.

Further I get:

ERROR: The Compose file './docker-compose.override.yml' is invalid because:
services.createbuckets.depends_on contains an invalid type, it should be an array

Referring to the documentation, the depends_on condition is not supported anymore in docker-compose 3.x. I fixed this with:

--- docker-compose.override.local.yml   2022-01-03 18:36:06.136000000 +0000
+++ docker-compose.override.yml 2022-01-03 18:43:10.536000000 +0000
@@ -102,8 +102,7 @@
   createbuckets:
     image: minio/mc
     depends_on:
-      minio:
-        condition: service_healthy
+      - minio
     entrypoint: >
       /bin/sh -c "
       /usr/bin/mc config host add myminio ${STORAGE_ENDPOINT_URL} ${STORAGE_ACCESS_KEY_ID} ${STORAGE_SECRET_ACCESS_KEY};

After these changes the containers are built and running.

Then I'm able to access the WebUI and it seems working.

But if I'm running the tests (got this from github workflows):

docker-compose run app python manage.py test -v2 qfieldcloud

They will fail:

...
FAIL: test_change_and_delete_pushed_only_features (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_list_all_deltas_and_list_deltas_by_deltafile (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_non_spatial_delta (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_apply_delta_file (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_apply_delta_file_conflicts_overwrite_false (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_apply_delta_file_conflicts_overwrite_true (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_apply_delta_file_twice (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_apply_delta_file_with_error (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_list_deltas (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_push_list_multidelta (qfieldcloud.core.tests.test_delta.QfcTestCase)
FAIL: test_download_file_for_qfield (qfieldcloud.core.tests.test_packages.QfcTestCase)
FAIL: test_download_project_with_broken_layer_datasources (qfieldcloud.core.tests.test_packages.QfcTestCase)
FAIL: test_downloaded_file_has_canvas_name (qfieldcloud.core.tests.test_packages.QfcTestCase)
FAIL: test_filename_with_whitespace (qfieldcloud.core.tests.test_packages.QfcTestCase)
FAIL: test_list_files_for_qfield (qfieldcloud.core.tests.test_packages.QfcTestCase)
FAIL: test_download_file_for_qfield (qfieldcloud.core.tests.test_qfield_file.QfcTestCase)
FAIL: test_download_project_with_broken_layer_datasources (qfieldcloud.core.tests.test_qfield_file.QfcTestCas
e)
FAIL: test_downloaded_file_has_canvas_name (qfieldcloud.core.tests.test_qfield_file.QfcTestCase)
FAIL: test_filename_with_whitespace (qfieldcloud.core.tests.test_qfield_file.QfcTestCase)
FAIL: test_list_files_for_qfield (qfieldcloud.core.tests.test_qfield_file.QfcTestCase)
FAILED (failures=20)
...

I attached the complete test.log

I tried to sync a project within the Test instance (exposed in a Local Network) but this also didn't work.

I haven't spent much time investigating the errors so far, because I think this should not happen on a fresh setup. Or is it maybe related to my changes in the other files?

Thanks for your help.

corrupted project, from the start

Hi everybody

When I download my project from QFieldCloud, QField Dev says "the locally stored project has been corrupted". Even if I reset the project, it is still corrupted. If I copy the project manually to QField (not Dev), it works.
Screenshot_20210712-172511_QField_Dev

The message log of QFIELD Dev says:
Screenshot_20210712-172437_QField_Dev

It seems that QField Dev wants to write a file named "delta.json" to the network drive of my computer, where the original project is stored.

I renamed the project in QField Cloud to "project for github issue" and made it publicly available.
For me it seems to be a bug, maybe it is a misconfiguration of the project.

Can't package wms

QField won't let me load a WMS from QField Cloud. The WMS works when opening the project in QGIS on my computer. When trying to load it, I get the following error:
Screenshot_20220516-214000

Removing and adding files from QFieldCloud Projects causes error

Hi Everybody,
I have been setting up a QFieldCloud project on my Mac (I also use a PC in case I have to manually package and upload the project to my Lenovo Android tablets). It's for an archaeological survey where I have 107 polygon shp files, one for each archaeological site, representing the survey grid. I first saved the project under a specific name, added all the files and styled them the way I want them to appear on the cloud and in QField on my devices.

However, when I remove/re-add or add any of the shp file layers in the QGIS app on my computer after I have packaged it for QFieldCloud, it will not update the project on the cloud. When I then try to re-open the project on my Lenovo tablet, I receive a package error, and it concerns all the files that I have removed and re-added to the project on my computer, or newly added files. The message I get in QField on my tablet is "QFieldCloud had trouble packaging your project", and further below it says, for each layer that did not load again after I had removed and reloaded it (or added it anew after creating the cloud project) on my computer and synchronized the project to the cloud: "Packed layer 'USP_001_Survey Grid' is not valid. Error code invalid_dataprovider, error message"/tmp/tmp294gmyu3/The Umma Survey Project/Field Work Data/ April 2022 Season/USP_001/USP_001 [...] missing."

The only workaround is that I delete the current QFieldCloud project, add all the layers I want, package it and then reload the new QFieldCloud project on my tablet. But it appears as if I cannot remove and reload files from the project itself, nor add new files to it and synchronize the project.
Any advice? Would appreciate any help.
Thank you

Layer ID: Cannot find layer id ''

We have erroneous commits with an empty layer id. I suspect it happens when one edits on a different map theme.

The changes are not applied. How can this be resolved?

Creating geodbs fails on hosted database systems such as AWS RDS

Thanks everybody for releasing QField Cloud to the public! The documentation was excellent and I got an instance up and running in no time with your instructions :)

However, long story short: I am running into difficulties with the architecture when running on AWS. The reason is that I thought of using an AWS hosted PostgreSQL RDS, and it seems that in those cases, the so-called AWS "superuser" actually has no rights to create databases for other users unless they share a role: http://asheiduk.de/post/setup-pg-on-rds/

This results in Django returning

InsufficientPrivilege at /admin/core/geodb/add/
must be member of role "test_user"

when trying to create a new geodb with a new user in Django admin.

I know this is a limitation of AWS databases, and not really QField Cloud's fault.

So I was wondering, will I need to set up a custom database cluster if I want to test out QFieldCloud?
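
(A possible workaround, not verified against QFieldCloud itself: on RDS the master user can usually create a database owned by another role only after being granted membership in that role, e.g. via psql; the host and master user names below are placeholders.)

psql -h <rds-endpoint> -U <master-user> -c 'GRANT test_user TO <master-user>;'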

QFieldCloud Jobs : very long time to Process QGIS Project File (SVG files)

Hello,

The project I created refers to 385 SVG files in the project file folder. (SVG file size between 1K and 7K)

I am currently working with QGIS 3.22.4 LTR and QFieldSync 4.0.0.

The SVG project files were uploaded successfully to the cloud, however the project is in busy status for a long time
image

Checking the jobs with the WebUI, I noticed that it takes 25 seconds per SVG file (pending -> started -> finished).
image

So for 385 files, at roughly 2 files per minute, it should take around 3 hours before the project status changes from busy to ready.
Could you find out why it takes so long, and/or fix this issue.

Sentry pollutes local dev traces

On local dev, Sentry pollutes the error logs quite a lot (see the example below).

Isn't there a way to have Sentry set up on local dev (like we do with smtp4dev), or otherwise can't we completely disable it on local dev?

orchestrator_1   | --- Logging error ---
orchestrator_1   | Traceback (most recent call last):
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 1008, in perform_job
orchestrator_1   |     rv = job.perform()
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/job.py", line 706, in perform
orchestrator_1   |     self._result = self._execute()
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/job.py", line 729, in _execute
orchestrator_1   |     result = self.func(*self.args, **self.kwargs)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/job.py", line 219, in func
orchestrator_1   |     return import_attribute(self.func_name)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/utils.py", line 130, in import_attribute
orchestrator_1   |     return getattr(module, attribute)
orchestrator_1   | AttributeError: module 'orchestrator' has no attribute 'apply_deltas'
orchestrator_1   |
orchestrator_1   | During handling of the above exception, another exception occurred:
orchestrator_1   |
orchestrator_1   | Traceback (most recent call last):
orchestrator_1   |   File "/usr/src/app/orchestrator/worker.py", line 31, in handle_exception
orchestrator_1   |     job_row = get_job_row(job.id)
orchestrator_1   |   File "/usr/src/app/orchestrator/db_utils.py", line 77, in get_job_row
orchestrator_1   |     cur.execute(
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/psycopg2/extras.py", line 146, in execute
orchestrator_1   |     return super(DictCursor, self).execute(query, vars)
orchestrator_1   | psycopg2.errors.UndefinedTable: relation "core_job" does not exist
orchestrator_1   | LINE 3:             FROM core_job j
orchestrator_1   |                          ^
orchestrator_1   |
orchestrator_1   |
orchestrator_1   | During handling of the above exception, another exception occurred:
orchestrator_1   |
orchestrator_1   | Traceback (most recent call last):
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 1081, in emit
orchestrator_1   |     msg = self.format(record)
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 925, in format
orchestrator_1   |     return fmt.format(record)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/json_log_formatter/__init__.py", line 62, in format
orchestrator_1   |     message = record.getMessage()
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 369, in getMessage
orchestrator_1   |     msg = msg % self.args
orchestrator_1   | TypeError: not all arguments converted during string formatting
orchestrator_1   | Call stack:
orchestrator_1   |   File "/usr/local/lib/python3.8/runpy.py", line 192, in _run_module_as_main
orchestrator_1   |     return _run_code(code, main_globals, None,
orchestrator_1   |   File "/usr/local/lib/python3.8/runpy.py", line 85, in _run_code
orchestrator_1   |     exec(code, run_globals)
orchestrator_1   |   File "/usr/src/app/orchestrator/__main__.py", line 1, in <module>
orchestrator_1   |     from . import worker  # noqa
orchestrator_1   |   File "<frozen importlib._bootstrap>", line 1042, in _handle_fromlist
orchestrator_1   |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
orchestrator_1   |   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
orchestrator_1   |   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
orchestrator_1   |   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
orchestrator_1   |   File "<frozen importlib._bootstrap_external>", line 783, in exec_module
orchestrator_1   |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
orchestrator_1   |   File "/usr/src/app/orchestrator/worker.py", line 56, in <module>
orchestrator_1   |     w.work()
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 590, in work
orchestrator_1   |     self.execute_job(job, queue)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 838, in execute_job
orchestrator_1   |     self.fork_work_horse(job, queue)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 755, in fork_work_horse
orchestrator_1   |     self.main_work_horse(job, queue)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 852, in main_work_horse
orchestrator_1   |     self.perform_job(job, queue)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/sentry_sdk/integrations/rq.py", line 74, in sentry_patched_perform_job 
orchestrator_1   |     rv = old_perform_job(self, job, *args, **kwargs)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 1024, in perform_job
orchestrator_1   |     self.handle_exception(job, *exc_info)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/sentry_sdk/integrations/rq.py", line 93, in sentry_patched_handle_exception
orchestrator_1   |     return old_handle_exception(self, job, *exc_info, **kwargs)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 1059, in handle_exception
orchestrator_1   |     fallthrough = handler(job, *exc_info)
orchestrator_1   |   File "/usr/src/app/orchestrator/worker.py", line 43, in handle_exception
orchestrator_1   |     logging.critical("Failed to handle exception: ", str(err))
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 2022, in critical
orchestrator_1   |     root.critical(msg, *args, **kwargs)
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 1481, in critical
orchestrator_1   |     self._log(CRITICAL, msg, args, **kwargs)
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 1577, in _log
orchestrator_1   |     self.handle(record)
orchestrator_1   |   File "/usr/local/lib/python3.8/logging/__init__.py", line 1587, in handle
orchestrator_1   |     self.callHandlers(record)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/sentry_sdk/integrations/logging.py", line 86, in sentry_patched_callhandlers       
orchestrator_1   |     return old_callhandlers(self, record)
orchestrator_1   | Message: 'Failed to handle exception: '
orchestrator_1   | Arguments: ('relation "core_job" does not exist\nLINE 3:             FROM core_job j\n                         ^\n',)

instead of more readable

orchestrator_1   | Traceback (most recent call last):
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/worker.py", line 1008, in perform_job
orchestrator_1   |     rv = job.perform()
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/job.py", line 706, in perform
orchestrator_1   |     self._result = self._execute()
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/job.py", line 729, in _execute
orchestrator_1   |     result = self.func(*self.args, **self.kwargs)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/job.py", line 219, in func
orchestrator_1   |     return import_attribute(self.func_name)
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/rq/utils.py", line 130, in import_attribute
orchestrator_1   |     return getattr(module, attribute)
orchestrator_1   | AttributeError: module 'orchestrator' has no attribute 'apply_deltas'
orchestrator_1   |
orchestrator_1   | During handling of the above exception, another exception occurred:
orchestrator_1   |
orchestrator_1   | Traceback (most recent call last):
orchestrator_1   |   File "/usr/src/app/orchestrator/worker.py", line 31, in handle_exception
orchestrator_1   |     job_row = get_job_row(job.id)
orchestrator_1   |   File "/usr/src/app/orchestrator/db_utils.py", line 77, in get_job_row
orchestrator_1   |     cur.execute(
orchestrator_1   |   File "/usr/local/lib/python3.8/site-packages/psycopg2/extras.py", line 146, in execute
orchestrator_1   |     return super(DictCursor, self).execute(query, vars)
orchestrator_1   | psycopg2.errors.UndefinedTable: relation "core_job" does not exist
orchestrator_1   | LINE 3:             FROM core_job j

What config-fields need to be set?

First of all, I'm very happy and appreciate the fact that people/organizations like you exist and create something open for the community! I would like to set up QFieldCloud on my server to check out whether I can use it together with QField for mapping e.g. fauna. My problem is that I'm not quite sure how to set up the .env config file.

In the Readme.md you just wrote:

Copy the .env.example into .env file and configure it to your desire

But what does this exactly mean? Like:

  1. Do I need to set up a PostgreSQL server that will be used by QFieldCloud, or is there a predefined docker image from you which I don't need to touch?
  2. The same question about Redis: do I need to set up Redis on the server or will it be handled by a docker image?
  3. What ports do I need to open in the firewall?
  4. I think it's wise to change the predefined passwords in the file, or will that break any internal procedures?
  5. Do I need to set up the e-mail config to try out QFieldCloud?

Furthermore: do I need to change something in the docker-compose.override.yml or any other config-file?

Greetings

Adding organization delete all projects related to the user

When we start on QFieldCloud we don't need an organization, but if we add one later, it deletes all projects created by the main user.
When we try to access them from the plugin or the website, it says that they don't exist.
Thank you and congratulations on this game-changer solution!

QField Cloud - Better error messages

Hello,

For any kind of project that I create on the cloud, I always have the same error : ""

When I try to download a project with one layer, it is working. So I guess there is something with my projects.
But with this message, I have no idea about what is wrong.

I create a cloud project for each layer, but the error is the same for all the geometric offline layers. And sometimes it works, sometimes it doesn't.
For the same project, sometimes I have an error message like the one on the picture, and sometimes I can download it.

Sans titre

Is there a way to have more information about this error when it happens?

Thank you.

How to increase the limit on number of collaborators

How do I increase the number of collaborators?
I have tried to register for the 'Pro' version but it does not work, as it says it is free while it is in beta.
What is the way out? I am trying to popularise QField to college students in the thousands,
so I need to add collaborators.

Symbolize a list of a related table in the app

I would like to know if there is any possibility to configure the QGIS project, or the app, in order to achieve the following workflow:

As the fieldworker edits the features, based on the list at the right part of the image (which points to single households in a building), it would be very useful that this text " Bloc: 0 - Escala: 1 - Planta: 1 Porta: 2" could be symbolized in colors depending on a given condition. For instance, the text (or the cell) becomes green, once the form for this record has been filled.

Now:

image

Expected behavior:

image

Thank you!

"Descriptions" instead of "Values" in attribute tables

Hi,
We are testing a QField project for bird censuses and I have noticed that QFieldCloud saves "Descriptions" instead of "Values" in attribute tables. The original point feature has a form made with the QGIS "Drag and Drop Designer".
I am not sure if this is a problem with QField Cloud or if it comes from the original QGIS form edition or configuration.
I have found a similar issue in QGIS development: opengisch/QField#31, already fixed if I am not wrong, but not for qfieldcloud
image
image

Best wishes,
Eladio

Export / Download Fail when project has no features

On creating a "clean" data collection in QGIS (e.g. a GeoPackage with fields and a attributes form created but no features), it is possible to covert the project to a QFieldCloud project using the "Convert currently open project to cloud project (recommended)" OK. However, it is not possible to download this project to QField with app displaying a download error. The Jobs on QFieldCloud for the failed Export report:

WARNING:QGIS_STDERR:QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-root'
INFO:__main__:Failed to obtain the project extent from project layers.

However, if I create a feature in QGIS and sync with QFieldCloud all works fine. There might have been a step in project creation which I've missed here.

@nirvn - I've added you to "simple-project" in QFieldCloud which demonstrates this in case you want to take a closer look.

Server not available?

Perhaps a temporary problem, but I can't reach the server (not with QField, not on desktop).

Server down?

Hi, I can't synchronize a project or reach/refresh the QFieldCloud website/login/dashboard. Perhaps some maintenance on the server?

Use layer outside on QFieldCloud projects (enhancement)

Use layer outside on QFieldCloud projects (enhancement)

Steps which explain the enhancement

Is it possible to use a common basemap for all QFieldCloud projects ?

Current and suggested behavior

In the documentation, I did not see the possibility to use a background outside the project files.
For a project used without QFieldCloud you have already implemented this functionality (documented here); is it possible to integrate this functionality for QFieldCloud?

Why would the enhancement be useful to most users

As before, without QFieldCloud I used a basemap (gpkg) of about 500 MB for all projects; it was really useful.

QField Version: 2.1.4 - Bumblebee 🐝
QFieldCloud Version: 0.14.2

Thank you,
best regards
