In order to meet economic as well as ecological boundary conditions, information technologies and sensors are increasingly gaining importance in horticulture. In combination with the reduced availability of human workers, automation technologies thus play a key role in the international competition in viticulture and orchards and have the potential to reduce costs as well as environmental impacts. The authors are working in the fields of unmanned or remotely controlled autonomous field robots, navigation, image-based sensor fusion, as well as agricultural applications. In particular, field robots have been applied for a few years in outdoor agricultural field applications. Within an interdisciplinary research group, these technologies are transferred to robot applications in vineyards and orchards. The goal is the availability of an autonomous service robot, where the first applications are site-specific plant protection (e.g. precise spraying), mulching, and picking up fruit boxes. A first version of the robot with electric drives and precise sprayers has already been developed. The applications, however, cover a large range of field conditions which have to be considered in the design of the vehicle and its applications. Thus, the authors have developed a 3D simulation environment which allows the robot platform to be tested virtually prior to its deployment. Moreover, the software algorithms can be transferred directly to the robot and thus allow iterative optimization during the development process. The generation and first applications of the 3D simulation environment for multi-sensor-based navigation and applications in vineyards and orchards are the focus of this work. The Robot Operating System (ROS) has been chosen as the software framework for integrating the autonomous vehicle, the sensors, and the environment for navigation and application processes. ROS supplies the 3D simulation environment Gazebo, which uses physics engines (e.g.
ODE – Open Dynamics Engine) in order to simulate the robot's behavior as close to reality as possible. Moreover, the software tool RViz is used for the visualization of the sensor data, for example for the optimization of navigation algorithms. The navigation in vine and fruit rows, the various applications, as well as safety issues require sensor-based solutions. The navigation itself is performed by image-based sensors, since GPS-based systems do not fulfill the requested functionality. In order to compensate for the varying selectivities of different sensors, concepts of sensor fusion are applied. Sensor data in ROS is exchanged by so-called messages, which can easily be logged to a database. For processing this data, ROS-integrated tools like NumPy (matrices and mathematical functions) or OpenCV (image processing) are used. An interface from the database to MATLAB is also a powerful tool for evaluating the sensor data offline and testing first algorithms. In practice, color cameras (for documentation purposes), 3D cameras, laser range finders, as well as ultrasonic multi-reflectance sensors are used. In addition, a priori data (such as maps or row distances) or GPS sensor information can be included and thereby increase the robustness of the navigation or the safety level. Within ROS, plugins for different sensors have been generated (color camera; 2D laser scanner Sick LMS511; 3D laser scanner Nippon FX-8; ToF camera Mesa SR4500). Together with environmental data of crop plants (or obstacles), the robot behavior with respect to navigation and application can be evaluated prior to field tests. For example, the leaf wall area for controlling a precision sprayer can be measured virtually and the resulting reduction of chemicals can be evaluated. ROS enables the use of the same control software for the simulation and the hardware (robot, actuators), thereby strongly reducing development times.
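As an illustration of such a virtual leaf wall area measurement, a minimal sketch in Python/NumPy could look as follows. All function names, thresholds, and geometry here are illustrative assumptions for the sketch, not the authors' actual implementation:

```python
import numpy as np

def leaf_wall_sections(depth, max_range=1.5, cell_height_m=0.05,
                       cell_width_m=0.05, n_sections=4):
    """Estimate leaf wall area per vertical sprayer section from a
    ToF-style depth image (image rows = vertical axis).
    Illustrative sketch only; parameters are assumptions."""
    foliage = depth < max_range            # pixels that hit the leaf wall
    cell_area = cell_height_m * cell_width_m
    # split the image into horizontal bands, one per nozzle section
    bands = np.array_split(foliage, n_sections, axis=0)
    return [band.sum() * cell_area for band in bands]

def nozzle_commands(areas, min_area_m2=0.02):
    """Open a nozzle section only where enough leaf wall was detected."""
    return [a >= min_area_m2 for a in areas]
```

With a simulated depth image in which only the upper half of the canopy returns foliage hits, only the upper nozzle sections would be opened, which is the kind of chemical-reduction effect that can be evaluated in the simulation before field tests.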
As a result, the simulation environment has been developed, and the results of first reactive row navigation algorithms are evaluated and compared to dynamic tests with real robots.
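A reactive row navigation step of this kind can be sketched, purely for illustration, as a centering controller on a 2D laser scan: steer toward the row centerline using the mean lateral distances to the left and right canopy walls. The sign convention, gain, and function names below are assumptions for the sketch, not the algorithm actually evaluated:

```python
import math

def row_steering(ranges, angles, max_valid=5.0, gain=0.8):
    """Reactive row-following sketch: return a steering angle [rad]
    (left positive) that centers the robot between the row walls.
    Illustrative only; gains and thresholds are assumptions."""
    left, right = [], []
    for r, a in zip(ranges, angles):
        if r >= max_valid:
            continue                      # no return within the row
        y = r * math.sin(a)               # lateral offset of the hit
        (left if y > 0 else right).append(abs(y))
    if not left or not right:
        return 0.0                        # one wall missing: go straight
    # robot right of center -> left wall farther -> steer left (positive)
    error = 0.5 * (sum(left) / len(left) - sum(right) / len(right))
    return gain * error
```

Because ROS allows the same control code to run against simulated and real laser scans, such a controller can be tuned in Gazebo first and then compared against dynamic tests with the real robot.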