Adam Dziedzic

My goal is to empower users with data-driven decisions. I am a postdoctoral researcher at the Vector Institute and the University of Toronto, advised by Prof. Nicolas Papernot. I earned my PhD at the University of Chicago, where I was advised by Prof. Sanjay Krishnan and worked on band-limited convolutional neural networks, the DeepLens project, and the out-of-distribution robustness of pre-trained transformers. My earlier research focused on data loading and migration between diverse database systems within the framework of the BigDAWG project. I obtained my Bachelor's and Master's degrees from the Warsaw University of Technology in Poland. I also studied at DTU (Technical University of Denmark) and carried out research on databases in the DIAS group at EPFL, Switzerland. As a PhD intern at Microsoft Research, I worked on the recommendation of hybrid physical designs (B+ trees and columnstores) for SQL Server. I also held internships at CERN (Geneva, Switzerland), Barclays Investment Bank (London, UK), Microsoft Research (Redmond, USA), and Google (Madison, USA). I spend the rest of my waking hours reading books, taking MOOCs, especially on machine and deep learning, enjoying many sports, playing the guitar, and practicing martial arts.

Publications


2021

  1. CaPC Learning: Confidential and Private Collaborative Learning
    Christopher A. Choquette-Choo, Natalie Dullerud, Adam Dziedzic, Yunxiang Zhang, Somesh Jha, Nicolas Papernot, Xiao Wang
    In International Conference on Learning Representations

    Slides Video Code Blog Post

    Machine learning benefits from large training datasets, which may not always be possible to collect by any single entity, especially when using privacy-sensitive data. In many contexts, such as healthcare and finance, separate parties may wish to collaborate and learn from each other’s data but are prevented from doing so due to privacy regulations. Some regulations prevent explicit sharing of data between parties by joining datasets in a central location (confidentiality). Others also limit implicit sharing of data, e.g., through model predictions (privacy). There is currently no method that enables machine learning in such a setting, where both confidentiality and privacy need to be preserved, to prevent both explicit and implicit sharing of data. Federated learning only provides confidentiality, not privacy, since gradients shared still contain private information. Differentially private learning assumes unreasonably large datasets. Furthermore, both of these learning paradigms produce a central model whose architecture was previously agreed upon by all parties rather than enabling collaborative learning where each party learns and improves their own local model. We introduce Confidential and Private Collaborative (CaPC) learning, the first method provably achieving both confidentiality and privacy in a collaborative setting. We leverage secure multi-party computation (MPC), homomorphic encryption (HE), and other techniques in combination with privately aggregated teacher models. We demonstrate how CaPC allows participants to collaborate without having to explicitly join their training sets or train a central model. Each party is able to improve the accuracy and fairness of their model, even in settings where each party has a model that performs well on their own dataset or when datasets are not IID and model architectures are heterogeneous across parties.
    @inproceedings{capc2021iclr,
      title = {CaPC Learning: Confidential and Private Collaborative Learning},
      author = {Choquette-Choo, Christopher A. and Dullerud, Natalie and Dziedzic, Adam and Zhang, Yunxiang and Jha, Somesh and Papernot, Nicolas and Wang, Xiao},
      booktitle = {International Conference on Learning Representations},
      year = {2021},
      url = {https://openreview.net/forum?id=h2EbJ4_wMVq}
    }
    
  2. Preoperative paraspinal neck muscle characteristics predict early onset adjacent segment degeneration in anterior cervical fusion patients: A machine-learning modeling analysis
    Arnold Y. L. Wong, Garrett Harada, Remy Lee, Sapan D. Gandhi, Adam Dziedzic, Alejandro Espinoza-Orias, Mohamad Parnianpour, Philip K. Louie, Bryce Basques, Howard S. An, Dino Samartzis
    Journal of Orthopaedic Research

    Early onset adjacent segment degeneration (ASD) can be found within six months after anterior cervical discectomy and fusion (ACDF). Deficits in deep paraspinal neck muscles may be related to early onset ASD. This study aimed to determine whether the morphometry of preoperative deep neck muscles (multifidus and semispinalis cervicis) predicted early onset ASD in patients with ACDF. Thirty-two cases of early onset ASD after a two-level ACDF and 30 matched non-ASD cases were identified from a large-scale cohort. The preoperative total cross-sectional area (CSA) of bilateral deep neck muscles and the lean muscle CSAs from C3 to C7 levels were measured manually on T2-weighted magnetic resonance imaging. Paraspinal muscle CSA asymmetry at each level was calculated. A support vector machine (SVM) algorithm was used to identify demographic, radiographic, and/or muscle parameters that predicted proximal/distal ASD development. No significant between-group differences in demographic or preoperative radiographic data were noted (mean age: 52.4 ± 10.9 years). ACDFs comprised C3 to C5 (n = 9), C4 to C6 (n = 20), and C5 to C7 (n = 32) cases. Eighteen, eight, and six patients had proximal, distal, or both ASD, respectively. The SVM model achieved high accuracy (96.7%) and an area under the curve (AUC = 0.97) for predicting early onset ASD. Asymmetry of fat at C5 (coefficient: 0.06) and standardized measures of C7 lean (coefficient: 0.05) and total CSA (coefficient: 0.05) were the strongest predictors of early onset ASD. This is the first study to show that preoperative deep neck muscle CSA, composition, and asymmetry at C5 to C7 independently predicted postoperative early onset ASD in patients with ACDF. Paraspinal muscle assessments are recommended to identify high-risk patients for personalized intervention.
    @article{wong2021ML,
      author = {Wong, Arnold Y. L. and Harada, Garrett and Lee, Remy and Gandhi, Sapan D. and Dziedzic, Adam and Espinoza-Orias, Alejandro and Parnianpour, Mohamad and Louie, Philip K. and Basques, Bryce and An, Howard S. and Samartzis, Dino},
      title = {Preoperative paraspinal neck muscle characteristics predict early onset adjacent segment degeneration in anterior cervical fusion patients: A machine-learning modeling analysis},
      journal = {Journal of Orthopaedic Research},
      volume = {39},
      number = {8},
      pages = {1732-1744},
      keywords = {adjacent segment, cervical, degeneration, disc, disease, muscles, paraspinal, spine},
      doi = {https://doi.org/10.1002/jor.24829},
      url = {https://onlinelibrary.wiley.com/doi/abs/10.1002/jor.24829},
      eprint = {https://onlinelibrary.wiley.com/doi/pdf/10.1002/jor.24829},
      year = {2021}
    }
    
  3. On the Exploitability of Audio Machine Learning Pipelines to Surreptitious Adversarial Examples
    Adelin Travers, Lorna Licollari, Guanghan Wang, Varun Chandrasekaran, Adam Dziedzic, David Lie, Nicolas Papernot

    @misc{travers2021exploitability,
      title = {On the Exploitability of Audio Machine Learning Pipelines to Surreptitious Adversarial Examples},
      author = {Travers, Adelin and Licollari, Lorna and Wang, Guanghan and Chandrasekaran, Varun and Dziedzic, Adam and Lie, David and Papernot, Nicolas},
      year = {2021},
      eprint = {2108.02010},
      archiveprefix = {arXiv},
      primaryclass = {cs.SD}
    }
    

2020

  1. Pretrained Transformers Improve Out-of-Distribution Robustness
    Dan Hendrycks, Xiaoyuan Liu, Eric Wallace, Adam Dziedzic, Rishabh Krishnan, Dawn Song
    In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

    Although pretrained Transformers such as BERT achieve high accuracy on in-distribution examples, do they generalize to new distributions? We systematically measure out-of-distribution (OOD) generalization for seven NLP datasets by constructing a new robustness benchmark with realistic distribution shifts. We measure the generalization of previous models including bag-of-words models, ConvNets, and LSTMs, and we show that pretrained Transformers’ performance declines are substantially smaller. Pretrained transformers are also more effective at detecting anomalous or OOD examples, while many previous models are frequently worse than chance. We examine which factors affect robustness, finding that larger models are not necessarily more robust, distillation can be harmful, and more diverse pretraining data can enhance robustness. Finally, we show where future work can improve OOD robustness.
    @inproceedings{hendrycks-etal-2020-pretrained,
      title = {Pretrained Transformers Improve Out-of-Distribution Robustness},
      author = {Hendrycks, Dan and Liu, Xiaoyuan and Wallace, Eric and Dziedzic, Adam and Krishnan, Rishabh and Song, Dawn},
      booktitle = {Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
      month = jul,
      year = {2020},
      address = {Online},
      publisher = {Association for Computational Linguistics},
      url = {https://aclanthology.org/2020.acl-main.244},
      doi = {10.18653/v1/2020.acl-main.244},
      pages = {2744--2751}
    }
    
  2. Machine Learning based detection of multiple Wi-Fi BSSs for LTE-U CSAT
    Vanlin Sathya, Adam Dziedzic, Monisha Ghosh, Sanjay Krishnan
    In International Conference on Computing, Networking and Communications (ICNC 2020)

    @inproceedings{sathya2020machine,
      title = {Machine Learning based detection of multiple Wi-Fi BSSs for LTE-U CSAT},
      author = {Sathya, Vanlin and Dziedzic, Adam and Ghosh, Monisha and Krishnan, Sanjay},
      booktitle = {International Conference on Computing, Networking and Communications (ICNC 2020)},
      year = {2020},
      organization = {IEEE}
    }
    
  3. An Empirical Evaluation of Perturbation-based Defenses
    Adam Dziedzic, Sanjay Krishnan
    arXiv preprint arXiv:2002.03080

    Paper

    @article{dziedzic2020empirical,
      title = {An Empirical Evaluation of Perturbation-based Defenses},
      author = {Dziedzic, Adam and Krishnan, Sanjay},
      journal = {arXiv preprint arXiv:2002.03080},
      year = {2020}
    }
    
  4. Machine Learning enabled Spectrum Sharing in Dense LTE-U/Wi-Fi Coexistence Scenarios
    Adam Dziedzic, Vanlin Sathya, Muhammad Rochman, Monisha Ghosh, Sanjay Krishnan
    IEEE Open Journal of Vehicular Technology

    @article{dziedzic2020machine,
      title = {Machine Learning enabled Spectrum Sharing in Dense LTE-U/Wi-Fi Coexistence Scenarios},
      author = {Dziedzic, Adam and Sathya, Vanlin and Rochman, Muhammad and Ghosh, Monisha and Krishnan, Sanjay},
      journal = {IEEE Open Journal of Vehicular Technology},
      year = {2020},
      publisher = {IEEE}
    }
    
  5. Input and Model Compression for Adaptive and Robust Neural Networks
    Adam Dziedzic
    Thesis

    @article{dziedzic2020input,
      title = {Input and Model Compression for Adaptive and Robust Neural Networks},
      author = {Dziedzic, Adam},
      year = {2020},
      publisher = {The University of Chicago}
    }
    

2019

  1. Band-limited Training and Inference for Convolutional Neural Networks
    Adam Dziedzic, Ioannis Paparrizos, Sanjay Krishnan, Aaron Elmore, Michael Franklin
    In ICML (International Conference on Machine Learning)

    @inproceedings{dziedzic2019band,
      title = {Band-limited Training and Inference for Convolutional Neural Networks},
      author = {Dziedzic, Adam and Paparrizos, Ioannis and Krishnan, Sanjay and Elmore, Aaron and Franklin, Michael},
      booktitle = {ICML (International Conference on Machine Learning)},
      year = {2019}
    }
    
  2. Artificial intelligence in resource-constrained and shared environments
    Sanjay Krishnan, Aaron J Elmore, Michael Franklin, John Paparrizos, Zechao Shang, Adam Dziedzic, Rui Liu
    ACM SIGOPS Operating Systems Review

    @article{krishnan2019artificial,
      title = {Artificial intelligence in resource-constrained and shared environments},
      author = {Krishnan, Sanjay and Elmore, Aaron J and Franklin, Michael and Paparrizos, John and Shang, Zechao and Dziedzic, Adam and Liu, Rui},
      journal = {ACM SIGOPS Operating Systems Review},
      volume = {53},
      number = {1},
      pages = {1--6},
      year = {2019},
      publisher = {ACM New York, NY, USA}
    }
    
  3. A Perturbation Analysis of Input Transformations for Adversarial Attacks
    Adam Dziedzic, Sanjay Krishnan

    @article{dziedzic2019perturbation,
      title = {A Perturbation Analysis of Input Transformations for Adversarial Attacks},
      author = {Dziedzic, Adam and Krishnan, Sanjay},
      year = {2019}
    }
    

2018

  1. Columnstore and B+ tree: Are Hybrid Physical Designs Important?
    Adam Dziedzic, Jingjing Wang, Sudipto Das, Bolin Ding, Vivek R Narasayya, Manoj Syamala
    In Proceedings of the 2018 International Conference on Management of Data (SIGMOD 2018)

    Commercial DBMSs, such as Microsoft SQL Server, cater to diverse workloads including transaction processing, decision support, and operational analytics. They also support variety in physical design structures such as B+ tree and columnstore. The benefits of B+tree for OLTP workloads and columnstore for decision support workloads are well-understood. However, the importance of hybrid physical designs, consisting of both columnstore and B+ tree indexes on the same database, is not well-studied — a focus of this paper. We first quantify the trade-offs using carefully-crafted micro-benchmarks. This micro-benchmarking indicates that hybrid physical designs can result in orders of magnitude better performance depending on the workload. For complex real-world applications, choosing an appropriate combination of columnstore and B+ tree indexes for a database workload is challenging. We extend the Database Engine Tuning Advisor for Microsoft SQL Server to recommend a suitable combination of B+ tree and columnstore indexes for a given workload. Through extensive experiments using industry-standard benchmarks and several real-world customer workloads, we quantify how a physical design tool capable of recommending hybrid physical designs can result in orders of magnitude better execution costs compared to approaches that rely either on columnstore-only or B+ tree-only designs.
    @inproceedings{dziedzic2018columnstore,
      title = {Columnstore and B+ tree-Are Hybrid Physical Designs Important?},
      author = {Dziedzic, Adam and Wang, Jingjing and Das, Sudipto and Ding, Bolin and Narasayya, Vivek R and Syamala, Manoj},
      booktitle = {Proceedings of the 2018 International Conference on Management of Data (SIGMOD 2018)},
      pages = {177--190},
      year = {2018},
      organization = {ACM}
    }
    
  2. Deeplens: Towards a visual data management system
    Sanjay Krishnan, Adam Dziedzic, Aaron J Elmore
    CIDR

    @article{krishnan2018deeplens,
      title = {Deeplens: Towards a visual data management system},
      author = {Krishnan, Sanjay and Dziedzic, Adam and Elmore, Aaron J},
      journal = {CIDR},
      year = {2018}
    }
    

2017

  1. Demonstrating the BigDAWG Polystore System for Ocean Metagenomics Analysis.
    Tim Mattson, Vijay Gadepally, Zuohao She, Adam Dziedzic, Jeff Parkhurst
    In CIDR

    @inproceedings{mattson2017demonstrating,
      title = {Demonstrating the BigDAWG Polystore System for Ocean Metagenomics Analysis.},
      author = {Mattson, Tim and Gadepally, Vijay and She, Zuohao and Dziedzic, Adam and Parkhurst, Jeff},
      booktitle = {CIDR},
      year = {2017}
    }
    
  2. BigDAWG polystore release and demonstration
    Kyle O'Brien, Vijay Gadepally, Jennie Duggan, Adam Dziedzic, Aaron Elmore, Jeremy Kepner, Samuel Madden, Tim Mattson, Zuohao She, Michael Stonebraker
    arXiv preprint arXiv:1701.05799

    Paper

    @article{obrien2017bigdawg,
      title = {Bigdawg polystore release and demonstration},
      author = {OBrien, Kyle and Gadepally, Vijay and Duggan, Jennie and Dziedzic, Adam and Elmore, Aaron and Kepner, Jeremy and Madden, Samuel and Mattson, Tim and She, Zuohao and Stonebraker, Michael},
      journal = {arXiv preprint arXiv:1701.05799},
      year = {2017}
    }
    
  3. Version 0.1 of the BigDAWG polystore system
    Vijay Gadepally, Kyle O'Brien, Adam Dziedzic, Aaron Elmore, Jeremy Kepner, Samuel Madden, Tim Mattson, Jennie Rogers, Zuohao She, Michael Stonebraker
    arXiv preprint arXiv:1707.00721

    Paper

    @article{gadepally2017version,
      title = {Version 0.1 of the bigdawg polystore system},
      author = {Gadepally, Vijay and OBrien, Kyle and Dziedzic, Adam and Elmore, Aaron and Kepner, Jeremy and Madden, Samuel and Mattson, Tim and Rogers, Jennie and She, Zuohao and Stonebraker, Michael},
      journal = {arXiv preprint arXiv:1707.00721},
      year = {2017}
    }
    
  4. BigDAWG version 0.1
    Vijay Gadepally, Kyle O’Brien, Adam Dziedzic, Aaron Elmore, Jeremy Kepner, Samuel Madden, Tim Mattson, Jennie Rogers, Zuohao She, Michael Stonebraker
    In 2017 IEEE High Performance Extreme Computing Conference (HPEC)

    @inproceedings{gadepally2017bigdawg,
      title = {BigDAWG version 0.1},
      author = {Gadepally, Vijay and O'Brien, Kyle and Dziedzic, Adam and Elmore, Aaron and Kepner, Jeremy and Madden, Samuel and Mattson, Tim and Rogers, Jennie and She, Zuohao and Stonebraker, Michael},
      booktitle = {2017 IEEE High Performance Extreme Computing Conference (HPEC)},
      pages = {1--7},
      year = {2017},
      organization = {IEEE}
    }
    
  5. Data Loading, Transformation and Migration for Database Management Systems
    Adam Dziedzic

    @phdthesis{dziedzic2017data,
      title = {Data Loading, Transformation and Migration for Database Management Systems},
      author = {Dziedzic, Adam},
      year = {2017},
      school = {The University of Chicago}
    }
    

2016

  1. Integrating Real-Time and Batch Processing in a Polystore
    John Meehan, Stan Zdonik, Shaobo Tian, Yulong Tian, Nesime Tatbul, Adam Dziedzic, Aaron Elmore
    In HPEC 2016

    @inproceedings{meehan2016integrating,
      title = {Integrating Real-Time and Batch Processing in a Polystore},
      author = {Meehan, John and Zdonik, Stan and Tian, Shaobo and Tian, Yulong and Tatbul, Nesime and Dziedzic, Adam and Elmore, Aaron},
      booktitle = {HPEC 2016},
      year = {2016}
    }
    
  2. Data Transformation and Migration in Polystores
    Adam Dziedzic, Aaron Elmore, Michael Stonebraker
    In HPEC 2016

    @inproceedings{dziedzic2016data,
      title = {Data Transformation and Migration in Polystores},
      author = {Dziedzic, Adam and Elmore, Aaron and Stonebraker, Michael},
      booktitle = {HPEC 2016},
      year = {2016},
      organization = {IEEE}
    }
    
  3. DBMS Data Loading: An Analysis on Modern Hardware
    Adam Dziedzic, Manos Karpathiotakis, Ioannis Alagiannis, Raja Appuswamy, Anastasia Ailamaki
    In ADMS 2016

    @inproceedings{dziedzic2016dbms,
      title = {DBMS Data Loading: An Analysis on Modern Hardware},
      author = {Dziedzic, Adam and Karpathiotakis, Manos and Alagiannis, Ioannis and Appuswamy, Raja and Ailamaki, Anastasia},
      booktitle = {ADMS 2016},
      year = {2016}
    }
    

2015

  1. BigDAWG: a Polystore for Diverse Interactive Applications
    Adam Dziedzic, Jennie Duggan, Aaron J. Elmore, Vijay Gadepally, Michael Stonebraker
    In IEEE Viz Data Systems for Interactive Analysis (DSIA)

    @inproceedings{dziedzic2015bigdawg,
      title = {BigDAWG: a Polystore for Diverse Interactive Applications},
      author = {Dziedzic, Adam and Duggan, Jennie and Elmore, Aaron J. and Gadepally, Vijay and Stonebraker, Michael},
      booktitle = {IEEE Viz Data Systems for Interactive Analysis (DSIA)},
      year = {2015}
    }
    

2014

  1. Analysis and comparison of NoSQL databases with an introduction to consistent references in Big Data storage systems
    Adam Dziedzic, Jan Mulawka
    In Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2014

    @inproceedings{dziedzic2014analysis,
      title = {Analysis and comparison of NoSQL databases with an introduction to consistent references in Big Data storage systems},
      author = {Dziedzic, Adam and Mulawka, Jan},
      booktitle = {Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2014},
      volume = {9290},
      pages = {92902V},
      year = {2014},
      organization = {International Society for Optics and Photonics}
    }
    

Projects

Collaborative Learning in ML


Confidential and Private Collaborative (CaPC) learning is the first method to provably achieve both confidentiality and privacy in a collaborative setting, using techniques from the cryptography and differential privacy literature.

paper slides talk bibtex
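The private-aggregation half of CaPC follows the PATE recipe: each answering party votes with its local model's prediction, and the querying party receives only a noisy argmax over the vote histogram. A minimal sketch of that aggregation step (function and parameter names are hypothetical, and the MPC/HE machinery that keeps the query itself confidential is omitted):

```python
import numpy as np

def noisy_aggregate(teacher_preds, num_classes, epsilon, rng=None):
    """Aggregate per-party label votes with Laplace noise (PATE-style).

    teacher_preds: predicted class indices, one per answering party.
    epsilon: privacy parameter; larger epsilon means less noise.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    votes = np.bincount(teacher_preds, minlength=num_classes).astype(float)
    votes += rng.laplace(loc=0.0, scale=1.0 / epsilon, size=num_classes)
    return int(np.argmax(votes))

# A querying party asks the other parties to label one of its inputs:
preds = np.array([2, 2, 1, 2, 0, 2, 2])  # answers from 7 local models
label = noisy_aggregate(preds, num_classes=3, epsilon=5.0)
```

The querying party then uses such noisy labels to improve its own local model, so no gradients or raw data ever change hands.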
Band-limited Training and Inference for Convolutional Neural Networks


The convolutional layers are core building blocks of neural network architectures. In general, a convolutional filter applies to the entire frequency spectrum of the input data. We explore artificially constraining the frequency spectra of these filters and data, called band-limiting, during training. The frequency domain constraints apply to both the feed-forward and back-propagation steps. Experimentally, we observe that Convolutional Neural Networks (CNNs) are resilient to this compression scheme and results suggest that CNNs learn to leverage lower-frequency components. In particular, we found: (1) band-limited training can effectively control the resource usage (GPU and memory); (2) models trained with band-limited layers retain high prediction accuracy; and (3) unlike other compression schemes, band-limited training requires no modification to existing training algorithms or neural network architectures.

paper slides talk bibtex
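As an illustration (not the paper's implementation), band-limiting can be sketched in one dimension: take the FFT, zero the high-frequency tail, and convolve in the frequency domain. The `keep_fraction` knob below is a hypothetical stand-in for the compression rate:

```python
import numpy as np

def band_limit(signal, keep_fraction):
    """Zero out the highest-frequency coefficients of a 1-D signal.

    keep_fraction: fraction of (low) frequency components to retain.
    """
    spectrum = np.fft.rfft(signal)
    cutoff = max(1, int(len(spectrum) * keep_fraction))
    spectrum[cutoff:] = 0.0  # discard the high-frequency tail
    return np.fft.irfft(spectrum, n=len(signal))

def band_limited_conv(x, w, keep_fraction):
    """Convolve via the convolution theorem, with input and filter band-limited."""
    n = len(x) + len(w) - 1  # full linear-convolution length
    X = np.fft.rfft(band_limit(x, keep_fraction), n=n)
    W = np.fft.rfft(band_limit(w, keep_fraction), n=n)
    return np.fft.irfft(X * W, n=n)

x = np.random.default_rng(1).standard_normal(32)
w = np.array([0.25, 0.5, 0.25])
exact = np.convolve(x, w)                          # keep_fraction=1.0 matches this
approx = band_limited_conv(x, w, keep_fraction=1.0)
```

Shrinking `keep_fraction` below 1.0 trades accuracy for less spectrum to store and multiply, which is the knob band-limited training exposes.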
Auto-recommendation of hybrid physical designs


We extend the Database Engine Tuning Advisor for Microsoft SQL Server to recommend a suitable combination of B+ tree and columnstore indexes for a given workload. Through extensive experiments using industry-standard benchmarks and several real-world customer workloads, we quantify how a physical design tool capable of recommending hybrid physical designs can result in orders of magnitude better execution costs compared to approaches that rely either on columnstore-only or B+ tree-only designs.

paper slides bibtex
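The selection problem can be caricatured with a toy cost model (all numbers below are made up): route each query to the cheapest available structure, charge a maintenance penalty per extra structure, and pick the cheapest overall design. This is only a sketch of the search a physical design tool performs, not the Tuning Advisor's actual cost model:

```python
# Hypothetical per-query costs: B+ trees favor point lookups, columnstores favor scans.
COSTS = {
    "btree":       {"lookup": 1.0,  "scan": 100.0},
    "columnstore": {"lookup": 50.0, "scan": 2.0},
}
MAINTENANCE = 30.0  # made-up penalty for each additional structure to maintain

def best_design(workload):
    """Return the cheapest of: B+ tree only, columnstore only, or hybrid.

    workload: list of (query_kind, count) pairs, e.g. [("lookup", 900), ("scan", 100)].
    """
    def cost(designs):
        # Each query is routed to the cheapest structure in the candidate design.
        query_cost = sum(n * min(COSTS[d][kind] for d in designs)
                         for kind, n in workload)
        return query_cost + MAINTENANCE * (len(designs) - 1)

    candidates = [("btree",), ("columnstore",), ("btree", "columnstore")]
    return min(candidates, key=cost)

mixed = best_design([("lookup", 900), ("scan", 100)])  # hybrid pays off here
pure = best_design([("lookup", 1000)])                 # a single B+ tree suffices
```

Even this caricature shows why hybrid designs matter: once a workload mixes lookups and scans, no single structure is within an order of magnitude of the hybrid's cost.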
BigDAWG


BigDAWG is an open-source project from researchers within the Intel Science and Technology Center for Big Data (ISTC) and a reference implementation of a polystore database. A polystore system is any database management system (DBMS) built on top of multiple, heterogeneous, integrated storage engines. I worked on the scaffolding of the system and then implemented a cast operator to move data between diverse DBMSs.

paper slides bibtex
Data Loading


We built an automated testing infrastructure to benchmark the loading performance of several commercial and open-source databases, performed an in-depth analysis to identify bottlenecks in the data loading process, and investigated novel techniques to accelerate DBMS data loading.

paper slides bibtex

Experience

Vector Institute

Research on machine and deep learning

September 2020 - current
Postdoctoral researcher

University of Toronto

Research on machine and deep learning

September 2020 - current
Visiting researcher

University of Chicago

Research on the intersection of machine/deep learning and database management systems (DBMSs).

July 2015 - August 2020
PhD student

Google

Eliminated a performance cliff in the F1 database for aggregation queries.

June - September 2017
PhD Software Engineering Intern at Data Infrastructure and Analysis team

Microsoft Research

Carried out research on hybrid physical designs for diverse workloads.

March - June 2015
Research Intern at Data Management, Exploration and Mining (DMX) group

EPFL

Research on data loading to diverse database management systems (DBMSs).

October 2014 - June 2015
Research Intern

Warsaw University of Technology

Student

October 2007 - September 2014
Obtained Bachelor's and Master's degrees. Worked on research projects.

Barclays Investment Bank

Created a system for validating and suggesting underlyings for complex financial products.

June - August 2013
Intern Analyst

CERN

Developed a system to store information on configuration and management of non-host devices at CERN Computer Centre.

April - December 2012
Technical Student at IT Department

Mobile Startup

Worked on an app for social interaction around music.

March 2012
Udarnik

Technical University of Denmark

Student

August 2010 - January 2011
Erasmus program (took courses on logic programming, applied statistics, Web 2.0 and mobile interactions, spatial databases, and Java programming).

Tekten

Designed a database and developed an application for a telecom company in Java and PL/SQL.

July 2010
Designer and Software Engineer

Torn

Worked on a financial and accounting system project.

July - September 2009
Software Engineer


Contact

Adam Dziedzic

Postdoctoral Fellow at the Vector Institute and the University of Toronto.

Office: Vector Institute, Toronto

The best way to contact me is through email.

© 2021 Adam Dziedzic