Yes. Data should be normalized feature by feature, since it would not make sense to divide something measured in units of °C by something measured in lux.
If you are building this with the Neural Network Toolbox, this is done automatically for you: the data for each feature are mapped to the range [-1, 1] by the mapminmax function. The same is done for the targets at the output layer.
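For illustration, here is a minimal sketch of applying mapminmax explicitly (the variable names X, T, Xnew, and Yn are hypothetical; note that mapminmax works row-wise, so each row is one feature and each column is one sample):

    % X: inputs, one feature per row; T: targets, one variable per row
    [Xn, psX] = mapminmax(X);   % each row (feature) mapped to [-1, 1]
    [Tn, psT] = mapminmax(T);   % targets normalized the same way

    % New data must be transformed with the SAME stored settings:
    XnewN = mapminmax('apply', Xnew, psX);

    % Network outputs are mapped back to the original target units:
    Y = mapminmax('reverse', Yn, psT);

Keeping the settings structures (psX, psT) is important so that test data and network outputs are scaled consistently with the training data.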
That being said, if you are normalizing the features yourself, you can do it with vectorized functions rather than looping over them one at a time. If you do have outliers in your data, then the zscore function (which standardizes each feature to zero mean and unit standard deviation) may be a more appropriate form of normalization.
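As a rough sketch of the manual approach, assuming X is a hypothetical n-by-p matrix with one feature per column (the opposite orientation from mapminmax) and a MATLAB version with implicit expansion (R2016b or later):

    % Min-max scale each column (feature) of X to [0, 1], then to [-1, 1]
    Xn = (X - min(X)) ./ (max(X) - min(X));
    Xn = 2*Xn - 1;

    % zscore (Statistics and Machine Learning Toolbox) instead standardizes
    % each column to zero mean and unit standard deviation:
    Xz = zscore(X);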