General Practice

This page will show you how to create a model that fits into the FLock process.

The main usage of the SDK:

  • Define the model standard. When creating a model, you need to define the following structure so that the FLock client can pick up the functions and run them (see the example below).

    • init_dataset - function that calls your data preprocessing logic

    • train - function that runs the training

    • evaluate - function that evaluates model performance

    • aggregate - function that aggregates the parameters returned by all participants

# import the FLock package
from flock_sdk import FlockSDK, FlockModel

# your model class must inherit from FlockModel
class example_model(FlockModel):

    def __init__(self, features, epochs, lr):
        self.features = features
        self.epochs = epochs
        self.lr = lr

    # override all of the functions below
    def init_dataset(self, dataset_path: str) -> None:
        # load and preprocess the dataset found at dataset_path
        return

    def train(self, parameters: bytes) -> bytes:
        # train on the local data, starting from the received parameters,
        # and return the updated parameters
        return parameters

    def evaluate(self, parameters: bytes) -> bytes:
        # evaluate the received parameters and return the result
        return parameters

    def aggregate(self, parameters_list: list[bytes]) -> bytes:
        # combine the parameters received from all participants
        return parameters_list[0]

if __name__ == "__main__":
    features, epochs, lr = 10, 1, 0.03  # example hyperparameters
    model = example_model(features, epochs=epochs, lr=lr)
    sdk = FlockSDK(model)
    sdk.run()
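The train, evaluate, and aggregate hooks exchange model parameters as raw bytes, so your model decides how those bytes are encoded and decoded. The sketch below is not part of the SDK; it only assumes one common choice, using pickle to round-trip a plain dict of weights:

import pickle

def serialize(weights: dict) -> bytes:
    # encode the in-memory weights into the raw bytes the FLock client exchanges
    return pickle.dumps(weights)

def deserialize(parameters: bytes) -> dict:
    # recover the weights from the bytes received from the client
    return pickle.loads(parameters)

# example round trip
blob = serialize({"w": [0.1, 0.2], "b": 0.0})
assert deserialize(blob) == {"w": [0.1, 0.2], "b": 0.0}

Inside train, for example, you would deserialize the incoming parameters, update the weights, and serialize them again before returning.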

Pinata Script

This Pinata script uploads a file to IPFS (InterPlanetary File System) using the Pinata cloud service. It defines a function that handles the file pinning and authenticates with an API key and secret read from environment variables. When executed, it expects a file path as an argument and prints the response from the pinning operation to the console. If the required argument is not provided, it prints a usage message and exits.

import os
from dotenv import load_dotenv
from pinatapy import PinataPy

load_dotenv()

# Pinata credentials are read from the environment (e.g. a .env file)
PINATA_API_KEY = os.getenv("PINATA_API_KEY")
PINATA_SECRET_API_KEY = os.getenv("PINATA_SECRET_API_KEY")

def pin_file_to_ipfs(path_to_file):
    # authenticate against Pinata and pin the given file to IPFS
    pinata = PinataPy(PINATA_API_KEY, PINATA_SECRET_API_KEY)
    response = pinata.pin_file_to_ipfs(path_to_file, "/", False)
    return response

if __name__ == "__main__":
    import sys
    if len(sys.argv) != 2:
        print("Usage: python pinata_api.py <path_to_file>")
        sys.exit(1)

    path_to_file = sys.argv[1]
    response = pin_file_to_ipfs(path_to_file)
    print(response)
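If you want to reuse the pinning logic from another Python script, you can also import the function directly. This is only an illustration: the file name below is a hypothetical example, and the response is the dict returned by Pinata, whose 'IpfsHash' key the shell scripts below rely on.

from pinata_api import pin_file_to_ipfs

response = pin_file_to_ipfs("model.tar.xz")  # hypothetical example path; use your own file
print(response.get("IpfsHash", ""))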

Shell script

This script compresses the current directory into a temporary file, uploads it to IPFS using the Pinata Python script described above, and then cleans up the temporary file. Here's a step-by-step summary of what it does:

  1. It creates a temporary file to store the compressed directory contents.

  2. It compresses the current directory (all files and subdirectories) into a compressed archive (named with an .xz extension) and times the operation.

  3. It calls a Python script, pinata_api.py, to upload the compressed file to IPFS via the Pinata service.

  4. After uploading, it extracts the IPFS hash of the uploaded file by splicing the printed response into a Python one-liner and reading its 'IpfsHash' key (see the sketch after this list).

  5. It prints the extracted IPFS hash, which represents the address of the uploaded content on IPFS.

  6. Finally, it removes the temporary compressed file to avoid leaving unnecessary files on the system.
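Concretely, the one-liner in both scripts substitutes the text printed by pinata_api.py directly into Python source, so the printed dict is evaluated as a Python literal (the json import in the one-liner is not actually used for parsing). The values below are placeholders, just to show what the substituted code ends up looking like:

# stand-in for the text printed by pinata_api.py (placeholder values)
data = {'IpfsHash': 'Qm...', 'PinSize': 123, 'Timestamp': '2023-01-01T00:00:00.000Z'}
print(data.get('IpfsHash', ''))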

Windows version

$ErrorActionPreference = "Stop"

# Create a temporary file
$OUTPUT_FILE = [System.IO.Path]::GetTempFileName()

# Measure the time taken to compress the current directory
Measure-Command { Compress-Archive -Path . -DestinationPath "$OUTPUT_FILE.xz" -CompressionLevel Optimal }

# Use the pinata_api.py script to pin the file to IPFS
Write-Host "Uploading the compressed image to IPFS.."
$response = python pinata_api.py "$OUTPUT_FILE.xz"

# Extract the IpfsHash from the response using Python
Write-Host "Extracting IpfsHash.."
$ipfs_hash = python -c "import json; data = $response; print(data.get('IpfsHash', ''))"
Write-Host "Model definition IPFS hash: $ipfs_hash"

# Clean up the temporary output file
Remove-Item "$OUTPUT_FILE.xz"

Linux version

#!/bin/bash
set -e

# Create a temporary file
OUTPUT_FILE=$(mktemp)

# Compress the current directory and time the operation
time (tar -czf "${OUTPUT_FILE}.xz" .)

# Use the pinata_api.py script to pin the file to IPFS
echo "Uploading the compressed image to IPFS.."
response=$(python pinata_api.py "${OUTPUT_FILE}.xz")

# Extract the IpfsHash from the response using Python
echo "Extracting IpfsHash.."
ipfs_hash=$(python -c "import json; data = $response; print(data.get('IpfsHash', ''))")
echo "Model definition IPFS hash: $ipfs_hash"

# Clean up the temporary output file
rm "${OUTPUT_FILE}.xz"
