Question: How would I go about making a persistent player....
See original GitHub issue
Hi again,
Sorry if I’m blowing your issues up, but this is a really nice package, and I’m curious if it might be able to handle my use case.
I have made this app:
rykr.netlify.com
It's a blog and music player in one, with a persistent player at the bottom of the screen at all times. I am trying to run all audio through it, but currently I need to define a set of files for the player to play at one time, like so:
const { tracks, assets } = useStaticQuery(graphql`
  query Tracks {
    tracks: allMdx(
      filter: { fileAbsolutePath: { regex: "/content/music/" } }
    ) {
      edges {
        node {
          fields {
            slug
          }
          frontmatter {
            name
            artist
            genre
            bpm
            # ADD BASE HERE
            artwork {
              base
              childImageSharp {
                fluid(maxWidth: 1000) {
                  ...GatsbyImageSharpFluid
                }
              }
            }
            alt
            description
            release(formatString: "MMMM Do, YYYY")
            audio {
              absolutePath
              base
            }
          }
        }
      }
    }
    # query all mp3 and jpg files from /content/music/
    assets: allFile(
      filter: {
        extension: { in: ["mp3", "jpg"] }
        absolutePath: { regex: "/content/music/" }
      }
    ) {
      edges {
        node {
          publicURL
          relativePath
        }
      }
    }
  }
`)
// Map over the query results to build an array of track objects, each holding a track's information and assets. useMemo avoids re-computing this when state changes, since the data is all static.
const trackList = useMemo(
  () =>
    tracks.edges.map(track => {
      const slug = track.node.fields.slug
      const { frontmatter } = track.node
      const {
        name,
        artist,
        genre,
        bpm,
        artwork,
        alt,
        description,
        audio,
      } = frontmatter
      return { slug, name, artist, genre, bpm, artwork, alt, description, audio }
    }),
  [tracks]
)
…so I am using a GraphQL static query in Gatsby to generate an array of tracks, which I then pull into the player, and it works OK, like so:
const [state, setState] = useState({
  audioPlayer: {},
  currentTrackIndex: null,
  isPlaying: false,
})

const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)

// Create the real Audio element only after mount, so Gatsby's server-side
// build never touches browser-only APIs.
useEffect(() => {
  setState({
    audioPlayer: new Audio(),
    currentTrackIndex: null,
    isPlaying: false,
  })
}, [])

const formattedTime = getTime(currentTime) || `0:00`
const formattedDuration = getTime(state.audioPlayer.duration) || `0:00`
// etc, etc etc...
…but what I really want is to have pieces of audio on many different pages in the app, with all of it connected to one audio context so anything can be played from anywhere, and whatever gets played is then added to the array. This would include full songs, audio snippets, sound effects, examples, and so on, all on different pages, but all routed through a single audio context.
I have no idea how to go about doing this, but I was curious whether you might have any ideas about how this could be done in a React and/or Gatsby app, or even just in general.
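Roughly, what I have in mind is something like the sketch below. It is only a sketch: the provider and hook names are made up for illustration, it reuses the same new Audio() element as my code above, and it assumes each track object carries a url field pointing at the file's public URL (in my setup that would come from the assets query's publicURL).
// Sketch only — AudioProvider and useGlobalAudio are illustrative names, not a library API.
import React, { createContext, useContext, useRef, useState } from "react"

const AudioContext = createContext(null)

// Wraps the whole app (e.g. via wrapRootElement in gatsby-browser.js) so the
// single Audio element and the growing track list survive page navigation.
export function AudioProvider({ children }) {
  // guard against SSR, where the Audio constructor does not exist
  const audioRef = useRef(typeof Audio !== "undefined" ? new Audio() : null)
  const [trackList, setTrackList] = useState([])

  const playTrack = track => {
    // add the track to the shared list if it isn't there yet
    setTrackList(list =>
      list.some(t => t.slug === track.slug) ? list : [...list, track]
    )
    if (audioRef.current) {
      audioRef.current.src = track.url // assumed shape: public URL of the file
      audioRef.current.play()
    }
  }

  return (
    <AudioContext.Provider value={{ audio: audioRef.current, trackList, playTrack }}>
      {children}
    </AudioContext.Provider>
  )
}

// Any snippet, sound effect or full song on any page can call this hook and
// route its playback through the one shared Audio element.
export const useGlobalAudio = () => useContext(AudioContext)
Any page component could then call useGlobalAudio().playTrack(...) from a button, and the persistent player at the bottom would just read trackList and audio from the same context.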
This is more of a discussion, I guess, than any specific issue; I just want to get your thoughts, if you have the time and interest to share them 😃
Issue Analytics
- State:
- Created 4 years ago
- Comments: 12 (6 by maintainers)
Top GitHub Comments
Cool, I'm tracking what you are saying. I will publish a roadmap this week detailing the features I plan to roll out. Loading multiple audio sources at once is currently not supported; only one audio source can be loaded at a time. This is why I load the new audio (the "let's hear em" button in the GlobalPlayer example) when the user wants to start listening to it, instead of what is already playing. Luckily, in this setup the new audio loads almost instantaneously, but in a more realistic environment the delay would probably be long enough to hurt the experience of a user who wants to hear their song the moment they click.
EDIT: here is the basic roadmap I created: https://github.com/E-Kuerschner/useAudioPlayer/projects/1
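For anyone reading along, the "load the new audio when the user clicks" pattern described above can also be done with a plain HTMLAudioElement like the one in the question; here is a minimal sketch with illustrative names (the component and props are not part of any library):
// Sketch: nothing is fetched until the user asks for this track.
import React from "react"

function LetsHearEmButton({ audioPlayer, trackUrl }) {
  const handleClick = () => {
    audioPlayer.src = trackUrl // swap the source at click time
    audioPlayer.load()
    audioPlayer.play()
  }
  return <button onClick={handleClick}>let's hear em</button>
}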
Yeah, that is expected behavior given the way it is currently implemented. What is happening is that when you visit the cat page, the Cat component is rendered for the first time and this line of code runs: useAudioPlayer('cats.mp3')
This causes the AudioPlayerContext to immediately load this file (and since it was previously playing a sound, it will autoplay once it finishes loading). What you want is a way to lazily load the audio at a later point in time. I do plan on building that into the API so that it is a little easier to achieve, but for now it should be possible by setting things up a little differently:
Basically, you would have a component that manages loading audio files by changing the src config in useAudioPlayer.
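One way to set that up today is to keep the chosen source in state and only mount the component that calls useAudioPlayer once the user has picked something, so nothing loads (or autoplays) on page render. This is only a sketch; it assumes the hook is imported from the react-use-audio-player package and accepts a file-name string, as in the useAudioPlayer('cats.mp3') call shown earlier in this thread.
import React, { useState } from "react"
import { useAudioPlayer } from "react-use-audio-player"

// Only mounted after the user chooses a file, so useAudioPlayer doesn't load
// (and autoplay) anything when the page itself first renders.
function LoadedAudio({ src }) {
  useAudioPlayer(src) // string-argument form, as used earlier in this thread
  return null
}

function CatPage() {
  const [src, setSrc] = useState(null)

  return (
    <>
      <button onClick={() => setSrc("cats.mp3")}>let's hear em</button>
      {src && <LoadedAudio src={src} />}
    </>
  )
}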