Sneha Hanumanthaiah
Publications
Abstract—The secure and trustworthy exchange of interoperable data assets is a key enabler for the development of IoT-based data spaces. This paper presents a decentralised framework for offering management and asset sharing within trusted and interoperable data spaces, leveraging Distributed Ledger Technologies (DLTs) and the Self-Sovereign Identity (SSI) paradigm to ensure transparency, integrity, and participant self-sovereignty. The proposed architecture integrates identity management and tamper-resistant smart contracts on the IOTA Tangle to support decentralised offering discovery, access control, and verifiable transactions. By addressing critical challenges related to trust, interoperability, and decentralisation, the framework contributes to the technological foundations required for scalable, secure, and resilient data ecosystems.

Building on the Self-Sovereign Identity (SSI) paradigm, the proposed solution ensures that participants retain full control over their digital identities and the assets they exchange, without relying on centralised intermediaries. The main contributions of this work are twofold. First, we propose a mechanism for the tokenization of asset offerings on the DLT, enabling each offering to be represented as a verifiable and uniquely identifiable digital entity. Each offering encapsulates essential metadata, including asset descriptions, access conditions, usage policies, and provider identity. This approach facilitates distributed, tamper-evident publication and efficient discovery of offerings, while ensuring that the responsibility for asset availability and delivery remains with the respective providers. Second, we define a modular and extensible architecture that integrates the tokenization process with decentralised identity management, fine-grained access control, and transaction verification. These capabilities are realised through smart contracts, which enforce tamper-resistant, auditable, and policy-compliant agreements among participants, thereby supporting trustworthy and decentralised interactions across different participant domains. To demonstrate the applicability of the proposed approach, the framework is integrated into an IoT-based data marketplace scenario, highlighting its flexibility, interoperability, and alignment with the principles of decentralisation and self-sovereignty. By contributing a generalisable and extensible solution for decentralised offering management and governance, this work advances the technological foundations required for the realisation of secure and resilient data ecosystems.

The remainder of this paper is structured as follows: Section II reviews related work on decentralised identity, data marketplaces, and the integration of DLTs and smart contracts into data governance frameworks. Section III outlines the architecture of the proposed framework. Section IV details the core processes and the overall operational behaviour of the system. Finally, Section V concludes with final remarks and future work.
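The tokenization mechanism described above can be illustrated with a minimal sketch. The field names, identifier formats, and hashing scheme below are illustrative assumptions, not the paper's actual schema: an offering bundles the metadata the abstract names (asset description, access conditions, usage policy, provider identity), and a deterministic digest of that bundle can be anchored on a ledger so any later tampering with the metadata is detectable.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class Offering:
    """Hypothetical offering token carrying the metadata fields named in the text."""
    offering_id: str
    provider_did: str          # provider identity, e.g. a decentralised identifier
    asset_description: str
    access_conditions: dict    # e.g. credentials a consumer must present
    usage_policy: dict         # e.g. permitted purposes, retention limits

    def digest(self) -> str:
        """Deterministic hash of the offering, suitable for anchoring on a DLT."""
        canonical = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

# The provider publishes the digest on the ledger; consumers recompute it from
# the received metadata to verify integrity before transacting.
offering = Offering(
    offering_id="offer-001",
    provider_did="did:example:provider-42",
    asset_description="Hourly temperature readings, building A",
    access_conditions={"credential": "DataSpaceMember"},
    usage_policy={"purpose": "research", "retention_days": 30},
)
anchored = offering.digest()
```

Note that only the digest needs to live on the ledger; the asset itself stays with the provider, matching the paper's point that availability and delivery remain the provider's responsibility.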
Deep learning models have achieved state-of-the-art performance across numerous domains, but their increasing size and computational complexity pose significant challenges for deployment in resource-constrained environments. Model pruning is a key technique to address this issue by reducing the number of model parameters. However, existing methods often present a trade-off between compression rate, computational speed-up, and performance preservation. This paper introduces a novel hybrid pruning methodology that strategically combines Weight Statistics Aware Pruning (WSAP)-based unstructured pruning with hardware-friendly structured channel pruning. Our approach first determines WSAP-driven pruning ratios using a heuristic based on the weights' Coefficient of Variation (CoV), allowing for more aggressive pruning of less critical layers. It then applies both fine-grained and channel-based pruning to maximize model compression while preserving accuracy. We demonstrate the effectiveness and generality of our method on two diverse tasks: Video Quality Assessment (VQA) with the DOVER-Mobile model and Time-Series Forecasting with the CrossFormer model. Our results show that the proposed hybrid method achieves a superior balance of efficiency and performance, reducing model parameters by up to 80% and FLOPs by over 50% while maintaining the accuracy of the original models. These improvements make our method well-suited for trustworthy and efficient deployment of deep learning models in shared and constrained environments.
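The per-layer ratio selection can be sketched as follows. This is a simplified illustration, not the paper's WSAP implementation: the exact mapping from CoV to pruning ratio, and the constants `base_ratio`, `scale`, and `max_ratio`, are assumptions. The idea shown is that the coefficient of variation (std/mean) of a layer's weight magnitudes drives how aggressively that layer is pruned, after which standard magnitude-based unstructured pruning zeroes the smallest weights.

```python
import numpy as np

def cov_pruning_ratio(weights, base_ratio=0.5, scale=0.3, max_ratio=0.9):
    """Illustrative heuristic: map the CoV of a layer's weight magnitudes to a
    pruning ratio. Here a lower CoV yields a higher ratio (the layer is treated
    as less critical); the paper's actual mapping may differ."""
    w = np.abs(weights).ravel()
    cov = w.std() / (w.mean() + 1e-12)
    ratio = base_ratio + scale / (1.0 + cov)
    return float(np.clip(ratio, 0.0, max_ratio))

def magnitude_prune(weights, ratio):
    """Unstructured pruning: zero out the fraction `ratio` of smallest-magnitude weights."""
    flat = np.abs(weights).ravel()
    k = int(ratio * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

rng = np.random.default_rng(0)
layer = rng.normal(size=(64, 64))   # stand-in for one layer's weight matrix
ratio = cov_pruning_ratio(layer)
pruned = magnitude_prune(layer, ratio)
sparsity = float(np.mean(pruned == 0))
```

In the hybrid scheme the abstract describes, a fine-grained step like this would be combined with channel-level (structured) pruning, which removes whole channels and therefore translates directly into FLOP reductions on standard hardware.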