Wow!
What kind of equipment are you using?
I've always heard that in astrophotography, a lot of the colors are "made up", does that happen here? Or is that nebula "really" green?
In some cases the colors aren't really made up as much as "inspired by real events". Sort of. This is simplified enough so that my brain can understand it...
A few things to note. There's a lot of gas and dust out there. Some of that gas gets enough energy from a nearby star to emit its own light. Imagine a neon light. Neon is a gas and when you run electricity through the gas it throws off photons. Hence, neon lights. The same idea happens out in space.
Okay, different gasses emit different wavelengths of light. Based on this fact, if you put a filter in front of a camera that only allows a very narrow band of light to pass through, you can test for the presence of certain gasses. If the filter only passes light at a wavelength of 672nm, then any light you capture was produced by Sulphur. Light at 656nm means Hydrogen, and light at 500nm means Oxygen.
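To make the idea concrete, here's a tiny sketch of that "which gas produced this light?" lookup, using the approximate wavelengths from above (the function name and tolerance are just made up for illustration):

```python
# Approximate emission-line wavelengths (nm) mentioned above.
EMISSION_LINES = {
    672: "Sulphur (S-II)",
    656: "Hydrogen (H-alpha)",
    500: "Oxygen (O-III)",
}

def identify_gas(wavelength_nm, tolerance=2):
    """Return the gas whose emission line is within `tolerance` nm, else None."""
    for line, gas in EMISSION_LINES.items():
        if abs(wavelength_nm - line) <= tolerance:
            return gas
    return None

print(identify_gas(656))  # Hydrogen (H-alpha)
print(identify_gas(600))  # None -- no known line near 600nm
```

In practice the filter itself does the "selecting", of course; the code just shows the mapping between wavelengths and gasses.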
When the Hubble telescope was launched it was fitted with a monochrome camera. It only takes photos in black and white. Different filters are placed in front of the camera so the team can search for various gasses based on the emission light the camera captures.
As you probably know, most of the color we see on TVs, computers, photographs, etc. is actually various combinations of Red, Green and Blue. Rather than looking at three different photos of the same object, the Hubble team combined three black and white images into a single photo by mapping the filtered wavelengths to Red, Green and Blue. The convention the Hubble team used is that Sulphur is mapped to Red, Hydrogen to Green and Oxygen to Blue. Then you can look at one color image and determine a lot about the target's makeup based on the colors. This is called SHO imaging, and a lot of the really beautiful Hubble telescope images you see are photos taken with narrowband filters.

Sulphur got mapped to red because its wavelength is closest to red among the colors we see, and Oxygen was mapped to blue because its wavelength is close to blue. You'll notice that sulphur and hydrogen are pretty close together, so they're both really shades of red, but to get a pretty photo you need someone to be green, and hydrogen was chosen (mostly because red was already taken by Sulphur). I'll play with SHO as well as HSO and HOO color combinations.
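The channel mapping itself is just stacking three monochrome frames into one RGB image. Here's a minimal sketch with NumPy, using made-up flat "exposures" in place of real filter frames; the array names and values are purely illustrative:

```python
import numpy as np

# Hypothetical 4x4 monochrome frames, one per narrowband filter,
# with pixel values normalized to 0..1.
sii  = np.full((4, 4), 0.2)  # Sulphur-II frame (~672nm)
ha   = np.full((4, 4), 0.8)  # Hydrogen-alpha frame (~656nm)
oiii = np.full((4, 4), 0.4)  # Oxygen-III frame (~500nm)

# SHO ("Hubble palette") mapping: S -> Red, H -> Green, O -> Blue.
sho = np.stack([sii, ha, oiii], axis=-1)

# Reordering the channels gives the other palettes mentioned above:
hso = np.stack([ha, sii, oiii], axis=-1)   # H -> R, S -> G, O -> B
hoo = np.stack([ha, oiii, oiii], axis=-1)  # H -> R, O -> both G and B

print(sho.shape)  # (4, 4, 3) -- a single RGB image
```

Real processing involves calibration, stretching, and star removal before this step, but the palette choice really is just which frame lands in which channel.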
The above description works well for targets that emit their own light like emission nebulae. But, LOTS of targets don't emit their own light; they reflect white light. For these reflection targets it's better to NOT use narrowband filters, but simply capture as much visible light as possible. For these types of targets you can use a "regular" color camera.
The first, second and fourth images I posted are emission nebulae. I photographed these with narrowband filters and played with the processing of the colors until I was happy. The general consensus in AP is that narrowband images are kind of "fair game" for artistic interpretation. My primary goal is to make pretty pictures rather than determine the composition of nebulae, so I make choices based on my opinion of beauty.
The third and fifth images I posted are reflection nebulae. These were taken with "regular" broadband light collection, exactly like you would get from a color camera. Again, I make processing choices to enhance the beauty of what is captured rather than trying to create a scientifically derived photo.
Finally, a lot of people ask the question, "If I took a space ship to this location, is that what I would see if I looked out the window?" No. Our eyes and brain interpret light very differently than a camera does when it captures light and builds an image for our brain to interpret.