Comments (4)
Hi,
Can you confirm that this message is related to your dataset preparation not working? It looks like it could be just a warning that shouldn't prevent the script from executing (see #2761).
You can also try using:
tfds.core.utils.gcs_utils._is_gcs_disabled = True
os.environ['NO_GCE_CHECK'] = 'true'
to avoid the error message.
Let us know if it works.
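For reference, a minimal sketch of where those two lines would go, set at the very top of the script before anything else touches tensorflow_datasets (the try/except guard is my addition, in case tensorflow_datasets isn't installed in the environment):

```python
import os

# NO_GCE_CHECK must be set before tensorflow_datasets runs its
# GCE metadata-server probe, so set it at the very top of the script.
os.environ['NO_GCE_CHECK'] = 'true'

try:
    import tensorflow_datasets as tfds

    # Private flag that stops tfds from looking datasets up on GCS at all.
    tfds.core.utils.gcs_utils._is_gcs_disabled = True
except ImportError:
    # tensorflow_datasets isn't installed here; the env var alone
    # already suppresses the GCE credential check.
    pass
```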
from datasets.
The error message you encountered suggests that TensorFlow is trying to authenticate with Google Cloud services to retrieve authentication tokens, but it's unable to do so because it's running in an environment where it can't access the necessary credentials.
Since you're working with a local dataset and don't need to interact with Google Cloud Storage, you can disable Google authentication by setting the environment variable GOOGLE_APPLICATION_CREDENTIALS to an empty string before running your script.
Here's how you can modify your script to disable Google authentication:
import os
# Disable Google Cloud authentication
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = ''
# Now import the required modules
import glob
import numpy as np
import tensorflow as tf
import tensorflow_datasets as tfds
import tensorflow_hub as hub
# Your existing code follows...
Adding this code at the beginning of your script will prevent TensorFlow from attempting to authenticate with Google Cloud services, and it should resolve the error you encountered.
Additionally, make sure that the file paths you're providing in your script ('/home/universal_sentence_encoder') are correct and accessible within your Docker container environment.
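A quick way to check that from inside the container is a small sketch like the following; model_path is the path from the script above, so adjust it if your model lives elsewhere:

```python
import os

# Path the script passes to tensorflow_hub; adjust if your model
# lives somewhere else inside the container.
model_path = '/home/universal_sentence_encoder'

# True only if the directory was actually mounted or copied into the container.
model_present = os.path.isdir(model_path)
print(f'{model_path} present: {model_present}')
if not model_present:
    print('Check your Dockerfile COPY steps or your docker run -v volume mounts.')
```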
Once you've made these modifications, try running your script again:
python language_table_use_dataset_builder.py
This should allow your script to build the dataset without encountering authentication errors. If you encounter any further issues, please let me know, and I'll be happy to assist you further.
Thanks for your suggestion, but it seems that the issue still remains.
It works, thanks!