Leverage Vertex AI for candidate classification and scoring
To get started, please work through this highly recommended learning path for hands-on lab experience with Vertex AI. In particular, focus on the following two tutorials:
Once you have completed the labs, set up a playground GCP project with the Vertex AI and Google Cloud Storage services enabled.
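Enabling both services can be done from Cloud Shell. A minimal sketch — the project ID below is a placeholder; the two API identifiers are the standard ones for Vertex AI and Cloud Storage:

```shell
# Point gcloud at your playground project (placeholder ID -- replace with yours).
gcloud config set project my-playground-project

# Enable the Vertex AI and Cloud Storage APIs for the project.
gcloud services enable aiplatform.googleapis.com storage.googleapis.com
```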
For playing around with the resume dataset:
- Please find the Kaggle Dataset here.
- Build an AutoML resume classifier: refer to this sample hello-text project, which walks through end-to-end project setup, data import, AutoML training, model evaluation, and deployment.
- Import the dataset into the project via the Vertex AI console > Datasets. In the import section, use the "Upload files from your computer" option.
- For convenience, I have already cleaned the dataset; the import-ready version is here.
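The cleaning step above can be sketched with pandas. This is a minimal illustration, not the exact cleaning I ran: the column names `Resume` and `Category` match the Kaggle resume dataset but should be verified against your download, and the headerless text,label output is an assumption based on Vertex AI's text classification CSV import format.

```python
import pandas as pd

def clean_resumes(df: pd.DataFrame) -> pd.DataFrame:
    """Prepare the raw Kaggle resume dataset for Vertex AI import.

    Assumes the Kaggle columns are 'Resume' (text) and 'Category' (label).
    """
    out = df[["Resume", "Category"]].dropna().drop_duplicates()
    # Collapse internal newlines/whitespace so each resume stays on one CSV row.
    out["Resume"] = out["Resume"].str.replace(r"\s+", " ", regex=True).str.strip()
    # Drop resumes that are empty after normalization.
    out = out[out["Resume"] != ""]
    return out.reset_index(drop=True)

# Example usage -- in the notebook you would read the raw Kaggle CSV instead.
raw = pd.DataFrame({
    "Resume": ["Python  developer\nwith GCP", "Python  developer\nwith GCP", None],
    "Category": ["Data Science", "Data Science", "HR"],
})
cleaned = clean_resumes(raw)
# Write a headerless text,label CSV, the shape the text classification
# import expects.
cleaned.to_csv("resumes_clean.csv", index=False, header=False)
```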
- For building custom text embedding models for candidate scoring:
- Within the same project, create a Cloud Storage bucket and upload the raw Kaggle dataset.
- In Vertex AI Workbench, create a managed notebook instance and launch JupyterLab; either create your own Python 3 notebook or import the sample one in this repo.
- If any libraries are missing, install them from a notebook cell:
!pip install library_name
- Also ensure the correct path to your CSV file in GCS is used when reading the data with pandas.
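Once the data loads, the scoring idea can be illustrated end to end. A minimal standard-library sketch, using a bag-of-words vector with cosine similarity as a stand-in for the custom text embeddings discussed above; the job description and resume texts are made-up examples:

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector.

    A real pipeline would substitute a trained text embedding model here.
    """
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Made-up job description and candidate resumes for illustration.
job = "python machine learning engineer with gcp experience"
resumes = {
    "cand_a": "experienced python developer, machine learning on gcp",
    "cand_b": "java backend engineer, spring and kubernetes",
}

job_vec = embed(job)
scores = {name: cosine(job_vec, embed(text)) for name, text in resumes.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```

In the notebook, the resume texts would come from the CSV you uploaded to the bucket, read with pandas via its `gs://` path.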