
Comments (6)

lqvir commented on August 31, 2024

We have updated the readme.md. Please use the following commands to build this repo:

git clone https://github.com/lqvir/APSU.git
cd APSU
mkdir build
cd build
cmake .. -DLIBOTE_PATH=/your/path
cmake --build . 

We did not encounter this Segmentation Fault error when running a 2^10 vs. 2^18 sized PSU case. It seems to appear in libOTe. Maybe try the default port? Could you provide more information about how you built this repo? OwO


SaintJackson commented on August 31, 2024

> We have updated the readme.md. Please use the following commands to build this repo:
>
> git clone https://github.com/lqvir/APSU.git
> cd APSU
> mkdir build
> cd build
> cmake .. -DLIBOTE_PATH=/your/path
> cmake --build .
>
> We did not encounter this Segmentation Fault error when running a 2^10 vs. 2^18 sized PSU case. It seems to appear in libOTe. Maybe try the default port? Could you provide more information about how you built this repo? OwO

Thank you for the quick update! I followed your new guidelines and rebuilt the project. Everything worked fine (i.e., all dependencies, libOTe, and APSU were built successfully) until I ran the Python script "auto-test.py".

This time it does not report the segmentation fault, but it raises other problems.

In the provided auto-test.py, the sender and receiver commands are as follows (I changed the paths of the sender and receiver CLIs and of the parameter JSON file to fit my environment):

# Imports used by this snippet (taken from the full auto-test.py).
import signal
import subprocess

def DDHwork(thread, param):
    # Launch the receiver and sender CLIs; `thread` and `param` come from the caller (see Test1 below).
    receiver_cmd = ["/home/APSU/build/bin/receiver_cli_ddh", "-d db.csv", thread, "-p " + param]
    sender_cmd = ["/home/APSU/build/bin/sender_cli_ddh", "-q query.csv", thread, "-p " + param]
    print(receiver_cmd)
    print(sender_cmd)
    outfileS = subprocess.Popen(sender_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    outfileR = subprocess.Popen(receiver_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    outfileR.wait()

    # The sender keeps running until interrupted once the receiver has finished.
    outfileS.send_signal(signal.SIGINT)

    testcast = ['0', '0']
    with open("recvfile" + testcast[0] + testcast[1], "a+") as fp:
        for i in outfileR.stdout.readlines():
            fp.write(i.decode())
    with open("sendfile" + testcast[0] + testcast[1], "a+") as fp:
        for i in outfileS.stdout.readlines():
            fp.write(i.decode())

The parameter file I chose is 16M-1024.json:

def Test1():
    param = '/home/APSU/parameters/16M-1024.json'
    receiver_size = pow(2, 18)
    sender_size = pow(2, 10)
    intersection_size = 256

    # prepare_data, item_bc, and thread_c are defined elsewhere in auto-test.py.
    prepare_data(receiver_size, sender_size, intersection_size, item_bc)

    DDHwork(thread_c[0], param)

Then the output recvfile's log:

INFO  04:09:03:741.624: /home/APSU/cli/receiver/receiver.cpp150
INFO  04:09:03:746.408: Setting thread count to 1
INFO  04:09:03:774.004: PSUParams have false-positive probability 2^(-53.2564) per receiver item
INFO  04:09:03:963.937: Start inserting 262144 items in ReceiverDB
INFO  04:09:04:017.970: Found 262144 new items to insert in ReceiverDB
INFO  04:09:04:086.072: /home/APSU/receiver/apsu/receiver_db.cpp158
INFO  04:09:04:086.212: /home/APSU/receiver/apsu/receiver_db.cpp187
INFO  04:09:04:646.406: /home/APSU/receiver/apsu/receiver_db.cpp190
baseOT time560.453
oprf time352.868
INFO  04:09:04:999.567: outputs_as_items489
INFO  04:09:05:134.122: data_with_indices785975
INFO  04:09:05:148.757: Launching 1 insert-or-assign worker tasks
ERROR 04:09:05:203.807: Failed to create ReceiverDB: cast failed
ERROR 04:09:05:228.546: Failed to create ReceiverDB: terminating

sendfile's log:

INFO  04:09:03:739.159: /home/APSU/cli/sender/sender.cpp126
INFO  04:09:03:746.237: /home/APSU/cli/sender/sender.cpp128
INFO  04:09:03:746.369: Connecting to tcp://localhost:1212
INFO  04:09:03:748.354: Successfully connected to tcp://localhost:1212
INFO  04:09:03:748.372: Sending parameter request
INFO  04:09:03:774.693: PSUParams have false-positive probability 2^(-53.2564) per receiver item
INFO  04:09:03:774.702: Received valid parameters
INFO  04:09:03:774.714: Setting thread count to 1
INFO  04:09:03:809.514: Sending APSU query
INFO  04:09:03:809.831: Creating encrypted query for 1024 items
INFO  04:09:03:810.261: /home/APSU/sender/apsu/utils.cpp17
baseOT time953.193

From the logs, BaseOT and OPRF worked properly : )

Unfortunately, 1) the receiver process failed to create the ReceiverDB, and 2) I found no clue about how the "union.csv" is produced, even though it is defined in "auto-test.py".

Looking forward to your reply. Thank you!


lqvir commented on August 31, 2024

The error message "cast failed" is quite vague, making it challenging to pinpoint the exact location of the issue. It doesn't appear to originate from our code. You might consider adding the option -l debug to output more detailed log information when executing receiver_cli_ddh. Additionally, upgrading your GCC could be beneficial, although I can't guarantee it.
Regarding union.csv, once the PSU concludes, receiver_cli_ddh will get the complement and save it to union.csv.
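For reference, here is a minimal standalone sanity check one could run after the protocol finishes (a hypothetical helper, not part of APSU; it assumes db.csv, query.csv, and union.csv each store one item per line, as produced by auto-test.py). Since "get the complement" could mean either the new items alone or the full union, it compares union.csv against both:

#include <algorithm>
#include <fstream>
#include <iostream>
#include <iterator>
#include <set>
#include <string>

// Read one item per line into a set.
static std::set<std::string> read_items(const std::string &path)
{
    std::set<std::string> items;
    std::ifstream in(path);
    for (std::string line; std::getline(in, line);) {
        if (!line.empty()) {
            items.insert(line);
        }
    }
    return items;
}

int main()
{
    // File names follow auto-test.py; adjust the paths for your environment.
    std::set<std::string> db = read_items("db.csv");        // receiver's items
    std::set<std::string> query = read_items("query.csv");  // sender's items
    std::set<std::string> result = read_items("union.csv"); // receiver's output

    // Full union of the two input sets.
    std::set<std::string> full_union = db;
    full_union.insert(query.begin(), query.end());

    // Complement: items the sender holds that the receiver does not.
    std::set<std::string> complement;
    std::set_difference(query.begin(), query.end(), db.begin(), db.end(),
                        std::inserter(complement, complement.begin()));

    std::cout << "union.csv size: " << result.size() << "\n"
              << "matches full union: " << (result == full_union) << "\n"
              << "matches complement: " << (result == complement) << "\n";
    return 0;
}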


SaintJackson commented on August 31, 2024

Thank you for the suggestion, and the reply to the expected output!

I upgraded GCC and g++ to version 10.3 and Clang to the latest release, then rebuilt the project on another Ubuntu 18.04 server; it produced the same error logs.

Even though I ran it in debug mode, it was still hard for me to pinpoint whether the error occurs when creating the CSVReader::DBData or the ReceiverDB.

ubuntu@cloud:/home/APSU/tools$ /home/APSU/build/bin/receiver_cli_ddh -d db.csv -t 1 -p /home/APSU/parameters/16M-1024.json -l debug --port 1212
INFO  10:56:21:345.300: /home/APSU/cli/receiver/receiver.cpp150
INFO  10:56:28:495.801: Setting thread count to 1
INFO  10:56:28:550.270: PSUParams have false-positive probability 2^(-53.2564) per receiver item
INFO  10:56:28:765.514: Start inserting 262144 items in ReceiverDB
DEBUG 10:56:28:765.591: Start computing OPRF hashes for 262144 items
DEBUG 10:56:28:768.374: Finished computing OPRF hashes for 262144 items
INFO  10:56:28:828.282: Found 262144 new items to insert in ReceiverDB
DEBUG 10:56:28:828.321: Start preprocessing 262144 unlabeled items
INFO  10:56:28:916.175: /home/APSU/receiver/apsu/receiver_db.cpp158
INFO  10:56:28:919.447: /home/APSU/receiver/apsu/receiver_db.cpp187
INFO  10:56:29:417.865: /home/APSU/receiver/apsu/receiver_db.cpp190
baseOT time501.79
oprf time408.57
INFO  10:56:29:826.714: outputs_as_items531
DEBUG 10:56:29:963.527: Finished preprocessing 262144 unlabeled items
INFO  10:56:29:963.562: data_with_indices785965
INFO  10:56:29:977.850: Launching 1 insert-or-assign worker tasks
DEBUG 10:56:29:978.101: Insert-or-Assign worker for bundle index 0; mode of operation: inserting new
ERROR 10:56:30:056.336: Failed to create ReceiverDB: cast failed
ERROR 10:56:30:082.413: Failed to create ReceiverDB: terminating

Still, could you please provide more details about the versions of the dependencies you use? The Boost version I built libOTe against is 1.77; I am not sure whether it is compatible with APSU.

Thank you!


lqvir commented on August 31, 2024

Thank you for your support and understanding. We have now determined that this error should be thrown by seal::safe_cast at:

// APSU/receiver/apsu/receiver_db.cpp: line 384

int32_t new_largest_bin_size =
    bundle_it->multi_insert_dry_run(data, bin_idx);

// Add this line here (line 386) to generate more info.
// If receiver_cli_ddh doesn't output this info, the "cast failed" error happens
// inside multi_insert_dry_run(); otherwise, we can check whether
// new_largest_bin_size itself causes the error.
APSU_LOG_INFO("new_largest_bin_size" << new_largest_bin_size);

// Check if inserting would violate the max bin size constraint
if (new_largest_bin_size > 0 &&
    safe_cast<size_t>(new_largest_bin_size) < max_bin_size) {}

Please add APSU_LOG_INFO("new_largest_bin_size" << new_largest_bin_size); right after the multi_insert_dry_run call at line 384, as shown above. You could also add more log output, e.g. about bin_idx, so we can learn more about the conditions that lead to the error.
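For background, the following is a minimal, self-contained sketch (not SEAL's actual implementation) of the kind of check a call like seal::safe_cast performs and why it reports "cast failed": the source value cannot be represented in the destination type, for example a negative int32_t converted to size_t. The extra APSU_LOG_INFO line suggested above then tells us whether the failing cast is already reached inside multi_insert_dry_run() or only afterwards.

#include <cstddef>
#include <cstdint>
#include <iostream>
#include <stdexcept>

// Checked conversion from int32_t to size_t: an unsigned destination type
// cannot represent a negative value, so reject it (illustration only).
std::size_t checked_to_size_t(std::int32_t value)
{
    if (value < 0) {
        throw std::logic_error("cast failed");
    }
    return static_cast<std::size_t>(value);
}

int main()
{
    std::int32_t value = -1; // hypothetical out-of-range input
    try {
        std::size_t converted = checked_to_size_t(value);
        std::cout << "cast ok: " << converted << "\n";
    } catch (const std::logic_error &e) {
        std::cout << "caught: " << e.what() << "\n"; // prints "caught: cast failed"
    }
    return 0;
}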

We compile and build this project with GNU GCC 11.4.0 on Ubuntu 22.04, and the Microsoft SEAL version we use is 4.1.1. Sorry, we did not test it with Clang.

Thank you!


SaintJackson commented on August 31, 2024

After switching to Ubuntu 22.04 (apt source: jammy) with Boost 1.77 and GCC/g++ 11.4, the (2^18, 2^10, 16M-1024.json) case worked.

Thank you very much!

