Open-Source Neuroscience Hardware Hack Chat

It isn't brain surgery. Until it is.

Wednesday, February 19, 2020 12:00 pm PST

Lex Kravitz and Mark Laubach will host the Hack Chat on Wednesday, February 19, 2020 at noon Pacific Time.

Time zones got you down? Here's a handy time converter!

There was a time when our planet still held mysteries, and pith-helmeted or fur-wrapped explorers could sally forth and boldly explore strange places for what they were convinced was the first time. But with every mountain climbed, every depth plunged, and every desert crossed, fewer and fewer places remained to be explored, until today there's really nothing left to discover.

Unless, of course, you look inward to the most wonderfully complex structure ever found: the brain. In humans, the 86 billion neurons contained within our skulls make trillions of connections with each other, weaving the unfathomably intricate pattern of electrochemical circuits that make you, you. Wonders abound there, and anyone seeing something new in the space between our ears really is laying eyes on it for the first time.

But the brain is a difficult place to explore, and specialized tools are needed to learn its secrets. Lex Kravitz, from Washington University, and Mark Laubach, from American University, are neuroscientists who've learned that sometimes you have to invent the tools of the trade on the fly. While exploring topics as wide-ranging as obesity, addiction, executive control, and decision making, they've come up with everything from simple jigs for brain sectioning to full feeding systems for rodent cages. They incorporate microcontrollers, IoT, and tons of 3D-printing to build what they need to get the job done, and they share these designs on OpenBehavior, a collaborative space for the open-source neuroscience community.

Join us for the Open-Source Neuroscience Hardware Hack Chat this week where we'll discuss the exploration of the real final frontier, and find out what it takes to invent the tools before you get to use them.

  • Hack Chat Transcript, Part 3

    Dan Maloney02/19/2020 at 21:06


    Andre Maia Chagas12:58 PM
    from there move to R/python, arduinos and version control

    Lex Kravitz12:58 PM
    @Dan Maloney OK!

    Mark Laubach12:59 PM
    @Andre Maia Chagas yes; we cover rpy2, oct2py, Arduinos, and Git

    Andre Maia Chagas1:00 PM
    @Mark Laubach do you have material available to share? it would be good to have a starting point, and a place to contribute with things we develop

    Thomas Shaddack1:00 PM
    @Lex Kravitz I've recently been pretty deeply involved with SLA 3D printing from liquid resins. Not only do you get high precision, but you can also easily add things to the resins to modify their properties. I saw some tests with crystal violet in latex paints to make them self-disinfecting under light, and it dissolves well in acrylate resins too; no resources here to test the self-sterilization, though.

    DrG1:00 PM
    @Lex Kravitz Working in a Psychiatry Dept, are you involved in any preclinical drug efforts? If so, how has any open behavior "task" been used in the effort [ @Mark Laubach too].

    Thomas Shaddack1:00 PM
    @Lex Kravitz 20 micrometer thin layers are easy to achieve.

    Mark Laubach1:01 PM
    @Andre Maia Chagas sort of... the course has changed over the years and it's been a goal to get the content out; the challenge is I am on sabbatical next year, so maybe I will finally find time to get that together and shared; I am open to sharing messy stuff any time though :-)

    So that was a super-fast hour, and we'll have to let Mark and Lex get back to work. Feel free to continue the discussion, though - this was really fascinating stuff. I really want to thank both Lex and Mark for the time and the great discussion.

    Mark Laubach1:01 PM
    @Dan Maloney and everyone: Thanks so much! This was a blast.

    Lex Kravitz1:02 PM
    @Thomas Shaddack That's awesome! We have an SLA printer but I'm afraid to mess with it too much because it was expensive and I don't want to void the warranty - the resins all come with RFID tags to stop us from playing with other resins... basically the opposite of open source. I want to get another one for playing around with

    Andre Maia Chagas1:02 PM
    @Mark Laubach cool, thanks! I'll shoot you a message later about this...

    I've been sharing things a bit "aggressively" on github.com/sussex-neuroscience

    mrdale19581:02 PM
    It was drinking from a firehose! Thanks!

    Andre Maia Chagas1:02 PM
    I'll put some stuff for the bootcamp there too

    Mark Laubach1:02 PM
    @Andre Maia Chagas let's stay in touch about this

    Lex Kravitz1:02 PM
    @Dan Maloney Thanks for inviting us!! This was a blast - I have to get back to work but anyone please feel free to reach out via messaging on here if you have other questions!

    And don't forget that a transcript is forthcoming. I'll post the link here when it's done.

    Thomas Shaddack1:02 PM
    @Lex Kravitz my level of involvement is mixing the resins from precursors. working with the cheapest kind of printer, the anycubic photon class.

    Mark Laubach1:03 PM
    @Dan Maloney Thanks!

  • Hack Chat Transcript, Part 2

    Dan Maloney02/19/2020 at 21:05

    Thomas Shaddack12:31 PM
    what about electromyography? could it be more useful for some uses than the notoriously fickle eeg?

    Charlie Lindahl12:31 PM
    Which resulted in a product https://eyedrivomatic.org/

    Andre Maia Chagas12:31 PM
    @Thomas here is an example paper where OSH does better than commercial tools https://www.nature.com/articles/s41598-017-02301-2

    Mark Laubach12:31 PM
    @Thomas reproducible results have started to come out of some shared projects, especially ones like DeepLabCut and the Open Miniscope. Many labs are using them.

    Charlie Lindahl12:31 PM
    I'm thinking openbci could augment this work.

    Thomas Shaddack12:32 PM
    ...random thought for communication. eyeblink sensing, one for each eye. left-eye blink is dash, right-eye dot, both is end of symbol or ignored.

    Thomas Shaddack12:32 PM
    could be a faster alternative to gaze-controlled keyboards.
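
    A minimal sketch (not from the chat) of how the blink-to-Morse scheme above could be decoded, assuming an upstream blink detector already emits "L", "R", or "BOTH" events; the event names and decode function are purely illustrative:

    ```python
    # Left-eye blink -> dash, right-eye blink -> dot, both eyes -> end of symbol.
    # Blink detection (camera, EMG/EOG, etc.) is assumed to happen upstream and
    # deliver a stream of events: "L", "R", or "BOTH".
    MORSE_TO_CHAR = {
        ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
        "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
        "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
        ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
        "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
        "--..": "Z",
    }

    def decode_blinks(events):
        """Translate a sequence of blink events into text."""
        text, symbol = [], ""
        for event in events:
            if event == "L":          # left-eye blink -> dash
                symbol += "-"
            elif event == "R":        # right-eye blink -> dot
                symbol += "."
            elif event == "BOTH":     # both eyes -> end of the current symbol
                text.append(MORSE_TO_CHAR.get(symbol, "?"))
                symbol = ""
        return "".join(text)

    # Four right blinks, both eyes, two right blinks, both eyes -> "HI"
    print(decode_blinks(["R"] * 4 + ["BOTH"] + ["R", "R", "BOTH"]))
    ```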

    Leonardo Gomes12:32 PM
    Hi guys, I'm Leonardo Gomes from Brazil and, just to give an example of a BCI application, I'm working with EEG for controlling sex toys and helping people with disabilities.

    https://www.mdpi.com/2218-6581/7/3/46 in this first version I used an Emotiv EPOC, but now I'm developing my own BCI based on the OPENBCI project.

    Chuck Glasser12:32 PM
    Simply tracking an animal's behavior is only one part of the problem. It's like tracking the movement of a car's steering wheel without knowing whether the car is traveling down a road under the control of a driver or a 3-year-old child is playing with the wheel while it's parked in the garage. Context is everything. Measuring the EEG of a rat in a box is not that hard. However, that alone is not useful. Tracking movement AND the EEG, now that is something valuable.

    RichardCollins12:33 PM
    There is a contract mechanism for NSF where they are asking for people to come up with sensors to monitor ANY indication of the state of the patient. Any data stream is possibly an indicator. They do not classify "behavior" but treat it as sensor fusion and look for patterns.

    Thomas Shaddack12:33 PM
    thought. can optogenetics be leveraged? something that can be injected into the neurons to make them sensitive to light, or emit light? could work around the issue with electrode stability.

    Thomas12:33 PM
    @Mark Laubach I agree. Peer-review-like approaches also work for lab equipment

    Lex Kravitz12:33 PM
    @Thomas Shaddack For many situations it may be! The important thing is to measure a good signal, which is where EEG is challenging: recording it well requires a controlled environment, gel electrode contacts, etc. So if EMG can give you what you want, then it can definitely be better. And yes, EMG for eyeblink sensing is great! I remember an EEG headband device that incorporated that as well; imo that part probably worked much better than the EEG sensing.

    Lex Kravitz12:34 PM
    @Leonardo Gomes Welcome Leonardo!

    Leonardo Gomes12:34 PM
    Thank you!

    Mark Laubach12:34 PM
    welcome!

    Thomas Shaddack12:34 PM
    are there major differences between emg and eeg amplifiers/electrodes?

    RichardCollins12:34 PM
    You can probably use the methods at OpenAI.com to process the raw signals. They seem to be commercial, but they do share their tools on GitHub

    Thomas Shaddack12:35 PM
    . o O ( Python evaluating the behavior of mice... )

    Mark Laubach12:35 PM
    One thing that we wanted to mention is that if anyone has a project that they have shared, let us know and we can get the word out on it either by a blog post or Tweet, or both. There is a contact form on this page: https://edspace.american.edu/openbehavior/

    Essaamar12:35 PM
    Just use a Kalman filter, it should do the job
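
    For readers following along, here is one rough way the Kalman-filter suggestion could look in practice: a constant-velocity filter smoothing a noisy 1-D position trace (for example, a tracked nose x-coordinate). The noise parameters and frame rate below are illustrative, not tuned values.

    ```python
    import numpy as np

    def kalman_smooth(positions, dt=1/30, q=1e-3, r=1.0):
        """Return smoothed positions from noisy 1-D measurements."""
        F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
        H = np.array([[1.0, 0.0]])              # we only measure position
        Q = q * np.eye(2)                       # process noise covariance
        R = np.array([[r]])                     # measurement noise covariance
        x = np.array([[positions[0]], [0.0]])   # initial state
        P = np.eye(2)                           # initial state covariance

        smoothed = []
        for z in positions:
            # Predict
            x = F @ x
            P = F @ P @ F.T + Q
            # Update with the new measurement
            y = np.array([[z]]) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
            smoothed.append(float(x[0, 0]))
        return np.array(smoothed)

    # Example: a noisy linear ramp
    noisy = np.linspace(0, 10, 300) + np.random.normal(0, 0.5, 300)
    print(kalman_smooth(noisy)[:5])
    ```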

    Lex Kravitz12:35 PM
    @Chuck Glasser Great way to cue-up my next project I wanted to highlight! https://bonsai-rx.org/ Bonsai is an open-source visual programming language made by Goncalo Lopes - it's a bit of a steep learning curve but it allows you to pretty easily capture data from a webcam, perform processing steps (like identifying a mouse in the image and tracking her position), and also record...
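
    Bonsai itself is a visual language, so there is no text code to quote here, but a rough Python/OpenCV analogue of the pipeline Lex describes (grab webcam frames, segment the animal, log its centroid with a timestamp) might look like the sketch below. The camera index, threshold, and frame count are placeholders, not values from the chat.

    ```python
    import csv
    import time
    import cv2

    cap = cv2.VideoCapture(0)                          # hypothetical camera index
    with open("positions.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "x", "y"])
        for _ in range(900):                           # ~30 s at 30 fps
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Dark animal on a light arena floor: inverse threshold gives a mask
            _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
            m = cv2.moments(mask)
            if m["m00"] > 0:                           # centroid of the segmented blob
                x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
                writer.writerow([time.time(), round(x, 1), round(y, 1)])
    cap.release()
    ```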

    Read more »

  • Hack Chat Transcript, Part 1

    Dan Maloney02/19/2020 at 21:04

    Essaamar11:57 AM
    What is the discussion?


    https://hackaday.io/event/169511-open-source-neuroscience-hardware-hack-chat

    HACKADAY

    Open-Source Neuroscience Hardware Hack Chat

    Lex Kravitz and Mark Laubach will host the Hack Chat on Wednesday, February 19, 2020 at noon Pacific Time. Time zones got you down? Here's a handy time converter! Unless, of course, you look inward to the most wonderfully complex structure ever found: the brain.

    Read this on Hackaday

    Mark Laubach11:58 AM
    Open source tools and neuroscience, and the OpenBehavior project

    Essaamar11:58 AM
    Excellent

    Essaamar11:58 AM
    Can everyone introduce themselves please

    Lex Kravitz11:59 AM
    Sure, are we good to get started? I'll kick off with a bit of an intro: My name is Lex Kravitz and I am a professor in the Psychiatry department at Washington University in St Louis. With our co-host, Mark Laubach, we started a website, Openbehavior.org, in 2016, to further open-source methods in behavioral research. In the intervening years, it seems that many people in the scientific community have started using and publishing open-source methods, so we’re here to share a few interesting projects with this community, and also answer any questions about neuroscience, research, life, etc!

    Essaamar11:59 AM
    I'm A Essa (discontinued PhD in AI) with a BSc in software engineering

    Mark Laubach11:59 AM
    I'm Mark Laubach. I am a neurobiologist with a lab at American University in DC. We study how decisions are made in the brain and use a lot of custom-made tech in our research.

    Yes, please, let's start. And welcome Mark and Lex!

    Lex Kravitz12:01 PM
    Thanks! I thought we'd start by sharing our website and highlighting some of the more interesting projects that we've covered on it

    Mark Laubach12:01 PM

    https://edspace.american.edu/openbehavior/category/all/

    OPENBEHAVIOR

    Most Recent Archives - OpenBehavior

    Camera Control is an open-source software package written by postdoctoral fellow Gary Kane that allows video to be recorded in sync with behavior. The Python GUI and scripts allow investigators to record from multiple imaging source camera feeds with associated timestamps for each frame.

    Read this on OpenBehavior
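
    This is not the Camera Control code itself, only a sketch of the core idea that description conveys: write a video file per camera plus a per-frame timestamp log so video and behavior can be aligned afterwards. The camera indices, codec, frame size, and frame count below are placeholders.

    ```python
    import csv
    import time
    import cv2

    CAMERAS = [0, 1]                                   # hypothetical camera indices
    SIZE = (640, 480)                                  # frames are resized to this
    caps = [cv2.VideoCapture(i) for i in CAMERAS]
    fourcc = cv2.VideoWriter_fourcc(*"MJPG")
    outs = [cv2.VideoWriter(f"cam{i}.avi", fourcc, 30.0, SIZE) for i in CAMERAS]

    with open("frame_times.csv", "w", newline="") as f:
        log = csv.writer(f)
        log.writerow(["camera", "frame", "timestamp"])
        for frame_idx in range(300):                   # ~10 s at 30 fps
            for cam, cap, out in zip(CAMERAS, caps, outs):
                ok, frame = cap.read()
                if ok:
                    out.write(cv2.resize(frame, SIZE)) # save the frame
                    log.writerow([cam, frame_idx, time.time()])

    for cap in caps:
        cap.release()
    for out in outs:
        out.release()
    ```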

    Thomas joined the room.12:02 PM

    Lex Kravitz12:03 PM
    We made a website, openbehavior.org, with the goal of promoting open-source methods in neuroscience. It's something Mark and I do for fun, along with the students in his lab, mainly Sam White and Linda Amarante. Maybe we can start with some youtube videos to demo some cool recent advances in open source neuroscience?

    mrdale1958 joined the room.12:04 PM

    Lex Kravitz12:04 PM
    One of my favorite open-source tools (that has been hugely influential in behavioral neuroscience over the last year) is DeepLabCut:

    Lex Kravitz12:04 PM

    https://www.youtube.com/watch?v=-SqlNx7wr0w

    YOUTUBE MACKENZIE MATHIS

    Lex Kravitz12:05 PM
    This is a markerless pose-tracking package that has been used to improve video tracking of mice, flies, and *many* other laboratory applications
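
    DeepLabCut is driven through its own project workflow, so the sketch below only illustrates working with the kind of per-frame output it produces. It assumes the tracking results have been flattened into nose_x, nose_y, and nose_likelihood columns (DeepLabCut's native CSV uses a multi-level header); the file name, likelihood cutoff, and frame rate are placeholders.

    ```python
    import numpy as np
    import pandas as pd

    FPS = 30                                     # assumed video frame rate
    df = pd.read_csv("tracked_video.csv")        # hypothetical flattened output

    # Drop low-confidence frames before measuring movement
    good = df[df["nose_likelihood"] > 0.9].copy()

    # Speed of the nose point in pixels per second
    dx = good["nose_x"].diff()
    dy = good["nose_y"].diff()
    good["speed_px_per_s"] = np.sqrt(dx**2 + dy**2) * FPS

    print(good["speed_px_per_s"].describe())
    ```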

    Lex Kravitz12:05 PM

    https://www.youtube.com/watch?v=Mv6wdF9Jt6k

    YOUTUBE MACKENZIE MATHIS

    Thomas12:06 PM
    That's pretty cool

    Wow, that's exactly what I tried to design in my old job 20+ years ago! We wanted to track where a rat was pointing his nose. Machine vision just wasn't up to the task at the time

    Mark Laubach12:06 PM
    @Dan Maloney That was my Master's project in 1990! Did not work so well.

    Lex Kravitz12:07 PM
    Cool! yeah most methods before this used "markers" - a piece of tape or something like that

    Chuck Glasser12:07 PM
    Perhaps you can define the difference between random movement, purposeful movement and a behavior?

    Thomas12:07 PM
    how did you solve that? training etc?

    Charlie Lindahl12:07 PM
    Intro for me (Charlie L aka CyberchuckTX): lifelong geek/nerd/programmer interested in these topics.

    Charlie Lindahl12:08 PM
    BTW, "openbehavior.org" times out when I try to access it ...

    Lex Kravitz12:08 PM
    Beyond being an amazing application,...

    Read more »

